History of Automated In-Line Accurate Measurement (March 1, 2020)

Blog 1 - 2020
Looking Back ...

Prior to the advent of automated in-line measurement, the traditional dimensional quality control strategy in the automotive body shop relied on sampling production with off-line coordinate measuring machines (CMMs) in temperature-controlled measurement rooms. The metrology science and techniques for touch-probe contact measurement were developed in the 1970s by metrology engineers in collaboration with the CMM companies. The quality engineers operating the CMMs were highly trained metrology specialists. The absolute accuracy of a typical CMM in the automotive body shop could reach 0.010mm in a local area, but when assessed throughout the machine volume, a maximum error of 0.100mm was more often the reality, particularly with dual-arm configurations.

When Perceptron introduced plant-floor-hardened automated in-line measurement in the mid-1980s, the focus was on 100% measurement data and statistical process control techniques for process variation reduction. The repeatability of the Perceptron technique was typically less than 0.100mm 3-Sigma. The systems were well suited to relative measurement, typically achieving a relative accuracy error on the order of 10%, because the relation from sensor coordinates to part coordinates was only crudely measured.
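As a rough illustration of the repeatability figure above, the 3-Sigma spread of repeated readings at a single checkpoint can be computed directly from the data; the readings below are invented illustrative values, not Perceptron data:

```python
import statistics

# Repeated in-line readings (mm) of one checkpoint on identical parts.
# Values are illustrative placeholders, not real measurement data.
readings_mm = [0.512, 0.498, 0.505, 0.490, 0.520, 0.508, 0.495, 0.515]

# 3-Sigma repeatability: three times the sample standard deviation.
three_sigma = 3 * statistics.stdev(readings_mm)
```

A system meeting the quoted specification would keep this value under 0.100mm for each checkpoint.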

And the debate over 100% measurement versus sampling began. One big question was what to do with the overload of data. Another was how much accuracy is enough. Data confidence also became a big challenge, as the laser optical techniques, which rely on image processing, were subject to influences that affected the results differently than the CMM touch probes. The desire to have traceability of the in-line measurements drove a process of correlating and offsetting the in-line measurements relative to the CMM, and this became a major effort for the quality engineers in the measurement rooms.

In the late 1980s, Perceptron invented and patented a technique for calibration of the in-line measurement stations directly into absolute coordinates. The technique made use of theodolites referenced to the part coordinate origin and a calibration target measurable by both the theodolites and the measurement sensor's laser. The relation from sensor coordinates into absolute part coordinates was generated for each sensor, stored, and applied to the measurements. This technique typically achieved absolute accuracy within 0.250mm when applied to fixed-mounted sensors. This reduced the CMM correlation and offset effort, but the differences between optical and touch-probe techniques remained.
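The stored sensor-to-part relation amounts to a rigid transform applied to each measurement. A minimal sketch, with a hypothetical rotation and translation standing in for the theodolite-derived calibration data:

```python
def to_part_coords(R, t, p_sensor):
    """Map a point measured in sensor coordinates into absolute part
    coordinates using a rigid transform: p_part = R * p_sensor + t."""
    return [sum(R[i][j] * p_sensor[j] for j in range(3)) + t[i]
            for i in range(3)]

# Hypothetical calibration for one sensor: rotated 90 degrees about Z
# and offset 500mm in X from the part origin (illustrative values only).
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]
t = [500.0, 0.0, 0.0]

p_part = to_part_coords(R, t, [10.0, 20.0, 5.0])  # -> [480.0, 10.0, 5.0]
```

Once generated, the same stored transform is applied to every point that sensor reports, so no per-part referencing is needed.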

In the early 1990s, interest in flexible automation and in measuring with robot-positioned sensors, rather than fixed-mounted sensors, at each checkpoint was growing, particularly in Japan and Korea. This was driven partly by the desire to run multiple models on a single line rather than single-model dedicated tooling.

Error from robot repeatability and thermal drift had to be overcome, and Perceptron and Nissan developed high-accuracy measurement robots with rectilinear axes to allow straightforward linear correction of thermal drift error. The measurement data was processed to optimize the numerically controlled tooling, an early instance of Industry 4.0-level automation and information exchange. This was followed by techniques for applying kinematic model-based thermal compensation to standard industrial robots to reduce measurement error caused by robot thermal drift. Absolute accuracy was initially still achieved by reference measurement techniques at each checkpoint, such as the theodolite or, eventually, the laser tracker, but results were never as accurate as with fixed-mounted sensors.
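What makes correction straightforward on rectilinear axes is that thermal growth along each axis can be modeled as proportional to the temperature deviation from a reference. A minimal sketch, with an invented drift coefficient and reference temperature rather than actual Perceptron/Nissan values:

```python
def correct_axis(reading_mm, axis_temp_c, ref_temp_c=20.0,
                 drift_mm_per_c=0.005):
    """Subtract a linear thermal growth term from a rectilinear axis
    reading. Both drift_mm_per_c and ref_temp_c are illustrative
    assumptions, not real calibration constants."""
    return reading_mm - drift_mm_per_c * (axis_temp_c - ref_temp_c)

# An axis reading taken 15 C above the reference temperature:
corrected = correct_axis(1250.075, axis_temp_c=35.0)
```

With articulated industrial robots, drift couples through every joint, which is why the later kinematic model-based compensation was needed in place of this simple per-axis form.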

During the early 2000s, techniques to calibrate robots into absolute coordinates and sustain that calibration were developed and refined, with the goal of simplifying the use of measurement robots and increasing the flexibility of the in-line measurement stations. The robot kinematic models and compensation techniques became more sophisticated and accurate. The industry-leading techniques developed by Perceptron to compensate for the absolute error of the robot TCP position and the relation from sensor coordinate to TCP coordinate to part coordinate could be relied on to achieve an absolute volumetric accuracy approaching 0.250mm. Standards, such as ISO 10360-8, were also developed and adopted for validating and comparing the volumetric accuracy of automated systems.
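The compensated sensor-to-TCP-to-part relation is, in effect, a chain of rigid transforms composed into one. A minimal sketch using 4x4 homogeneous matrices, with pure translations as hypothetical placeholder calibrations (a real chain would also carry rotations and the kinematic compensation terms):

```python
def matmul4(a, b):
    """Multiply two 4x4 homogeneous transform matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(T, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    v = [p[0], p[1], p[2], 1.0]
    return [sum(T[i][j] * v[j] for j in range(4)) for i in range(3)]

def translation(tx, ty, tz):
    return [[1.0, 0.0, 0.0, tx],
            [0.0, 1.0, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

# Hypothetical calibrated links in the chain (illustrative values only):
T_part_from_base  = translation(-100.0, 0.0, 0.0)    # robot base in part frame
T_base_from_tcp   = translation(800.0, 200.0, 500.0) # compensated TCP pose
T_tcp_from_sensor = translation(0.0, 0.0, 150.0)     # sensor mount offset

T_part_from_sensor = matmul4(matmul4(T_part_from_base, T_base_from_tcp),
                             T_tcp_from_sensor)

p_part = apply(T_part_from_sensor, [1.0, 2.0, 3.0])  # -> [701.0, 202.0, 653.0]
```

Any residual error in the TCP pose or the sensor mount offset propagates directly into the part coordinates, which is why the compensation of each link in the chain matters for volumetric accuracy.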

More recently, Perceptron has pioneered major advances in optical measurement techniques and 3D point cloud laser sensors, such as the Helix sensor family. Helix was developed to produce measurements that match the CMM touch-probe techniques, virtually eliminating this long-standing correlation error factor. Perceptron developed self-learning software for compensating measurements such that plant floor temperature-induced dimensional changes of the measured part do not influence the measurement results. Software for split-cycle configurations, where different checkpoints are measured on different cycles, has been introduced to maximize in-line checkpoint coverage. And off-line programming techniques, including the use of Digital Twins to fully simulate automated systems, have simplified the programming and maintenance of those systems.