3D Dimensional Analysis & Quality Control
Mixed Reality: The Most Significant Advance in Large-Scale 3D Metrology Since the Laser Tracker
Boasting high accuracy over large measurement ranges, laser trackers have earned their reputation as a revolutionary process advance in large-scale 3D metrology for aerospace, ground transportation, energy, and naval manufacturing, among other industries. Efforts to optimize laser tracker workflows have included working with two operators, displaying geometries on measured pieces with external projectors, and using a mobile phone as a remote control.
These techniques have now reached their limits.
Fortunately, the technology to surpass these limits already exists. With its holographic display, tracking systems, cameras, 3D scanner, and powerful software, the Microsoft HoloLens 2 smart glasses together with the PolyWorks|AR™ app from InnovMetric boost large-scale measurement performance by an order of magnitude. With mixed reality, large-scale metrology will never be the same:
- Operators always measure the right features -- guidance graphics are superimposed on the measured piece
- Large screens and external projectors -- unnecessary
- Operators can work hands-free
- Changing inspection tasks is one gesture away
Small Versus Large
For objects smaller than two meters, 3D measurement is pretty straightforward. Operators know their position with respect to what they measure at all times. They can generally recognize and relate what they see on a computer screen with a location on the piece being measured. And they know how to reach the next measurement feature following a predefined sequence, quickly returning to the computer if they require a mouse or keyboard to interact with 3D measurement software.
Increase the size of the measured piece beyond 5 meters, though, and measurement performance issues grow as well. Operators may have difficulty understanding their position in space and distinguishing a measurement target among a multiplicity of other features. Establishing the correspondence between a computer display and a physical location becomes increasingly hard. Operators may need to cover several meters to reach the next measurement feature. And returning to the computer operating the 3D measurement software increases task time.
With laser trackers on the scene, customers quickly began adapting their large-scale measurement techniques. Such tasks often required two operators: one physically performing measurements, the other at the computer operating measurement software to invoke functionalities and address questions. Other adaptations included redirecting the computer screen to a large monitor, projection screen, or blank wall, providing improved visuals for operators.
While better, these initial solutions were imperfect. Needing two operators doubles the human cost of measurement tasks, and maintaining necessary eye contact with a fixed screen while moving within a large assembly is very difficult.
Improvements Evolve
Fortunately, two technologies allowed operators to make major performance gains in large-scale 3D measurement tasks.
The first was projection technology: laser projectors that trace 3D contours with a laser beam, and area-based projectors that project full images. Both types can project guidance geometry and measurement results on the surface of measured pieces, facilitating the execution of measurement sequences and the analysis of measurement results.
However, projector use can be difficult and constraining. Properly localizing the projector into the coordinate system of the measured piece poses difficulties. And a projector can only reach surfaces visible from its point of view, which could require moving the projector to multiple locations or purchasing multiple projectors to efficiently handle large assemblies. New assignments then require new setups.
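To picture that localization step concretely, here is a minimal sketch in Python with NumPy (the transform and point names are hypothetical; a real projector workflow would solve the transform from reference targets measured on the piece) of expressing guidance geometry, defined in the piece coordinate system, in the projector's own frame:

```python
import numpy as np

# Minimal sketch: express guidance geometry, defined in the coordinate
# system of the measured piece, in the projector's own frame.
# T_projector_from_piece is assumed to come from localizing the projector
# against reference targets measured on the piece (hypothetical workflow).

def to_projector_frame(points_piece, T_projector_from_piece):
    """Apply a 4x4 homogeneous rigid transform to an Nx3 array of points."""
    pts = np.asarray(points_piece, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])   # N x 4 homogeneous points
    return (homog @ T_projector_from_piece.T)[:, :3]   # back to N x 3

# Example: a small guidance contour (metres) and an identity localization.
contour = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0], [1.2, 0.8, 0.0]])
print(to_projector_frame(contour, np.eye(4)))
```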
The second technology to improve large-scale measurement tasks was the mobile phone, which specialized apps quickly transformed into a remote control. Operators could visualize guidance geometry and measurement results on the phone screen. When standing close to the 3D measurement device, and with 3D measurement software that automatically adapts its display to the device's position, they could also easily match a feature on the screen to a location on the measured piece.
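One rough way to picture that display adaptation (a sketch only, not how any particular product implements it; the positions are made-up inputs): place the software's virtual camera at the 3D measurement device and orient it toward the measured piece.

```python
import numpy as np

def look_at(eye, target, up=(0.0, 0.0, 1.0)):
    """Build a 4x4 view matrix for a camera at 'eye' looking toward 'target'."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    forward = target - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[:3, :3] = np.vstack([right, true_up, -forward])  # world-to-camera rotation
    view[:3, 3] = -view[:3, :3] @ eye                     # move the eye to the origin
    return view

# Example: virtual camera placed at the laser tracker, looking at the piece centre.
print(look_at(eye=[5.0, 2.0, 1.5], target=[0.0, 0.0, 1.0]))
```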
Users also could interact with the computer remotely, enabling large-scale measurement by a single operator in many cases.
Furthermore, 3D visual information on mobile phone remote controls is always available, as there are none of the shadow areas that can occur with projectors.
Yet mobile phones also have their limitations. Many lack the sensors to measure their orientation in 3D space. Screen displays only correspond to the operator's view of the measured piece when the phone is close to the 3D measurement device. Operators must also carry the phone while measuring. If climbing a ladder, for example, an operator needs both hands to remain safe.
Mixed Reality Goes Beyond
Emerging mixed reality technology is transforming large-scale metrology by offering the same benefits as projectors and remote controls, without their limitations, while delivering several additional powerful capabilities.
Compared to using projectors or mobile phones, Microsoft’s HoloLens 2 smart glasses showcase numerous advantages. They can:
- Track changes of position and orientation (6-degrees-of-freedom tracking)
- Track eye movements
- Offer holographic projection technology, projecting graphical information on a measured piece once the device is localized with respect to the piece coordinate system
- Recognize hand gestures via several cameras and embedded software
- Scan the surrounding environment
These smart-glasses fundamentals enable the development of mixed reality apps that interconnect with 3D measurement software to provide projection and remote-control functionalities. Stable geometries projected onto measured pieces guide measurement sequences and the review of results, regardless of the operator's position and without any shadow areas. There is no fixed setup; operators can quickly switch from one piece to another, using instinctual gestures to interact with the user interface. Measurement is safer, as operators work hands-free.
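A simplified way to think of that interconnection (the gesture and command names below are hypothetical, not the PolyWorks|AR API): the mixed reality app recognizes a gesture and forwards a corresponding command to the 3D measurement software over its remote-control link.

```python
# Hypothetical sketch of a gesture-to-command bridge between a mixed reality
# app and 3D measurement software; all names are illustrative only.
GESTURE_COMMANDS = {
    "air_tap": "MEASURE_CURRENT_FEATURE",
    "pinch_hold": "SKIP_TO_NEXT_FEATURE",
    "palm_up_menu": "SHOW_INSPECTION_SEQUENCE",
}

def dispatch(gesture: str, send_command) -> bool:
    """Translate a recognized gesture into a measurement-software command."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is None:
        return False          # unrecognized gesture: ignore it
    send_command(command)      # e.g., sent over the app's remote-control link
    return True

# Example: print the command instead of sending it to real software.
dispatch("air_tap", send_command=print)
```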
Sensors of mixed reality devices also unlock major innovations not available from projectors or remote controls. Since the operator's position and point of view are always known, redirecting a lost laser-tracker beam to the operator is easy. So is handling a change in operator location when a large displacement is required. Controlling a cursor and creating a 3D point at a specific location is easily done with the head and eyes, as is making an annotation on a color map, reporting a defect, or defining a reference point for an alignment.
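To illustrate the beam-redirection idea, here is a purely geometric sketch (assuming, for illustration only, a Z-up coordinate system and a tracker steered by azimuth and elevation angles): with the operator's position reported by the headset, the angles needed to aim the tracker's beam toward the operator follow directly.

```python
import math

def steer_angles(tracker_pos, operator_pos):
    """Azimuth/elevation (degrees) pointing from the tracker toward the operator.

    Assumes a Z-up coordinate system with azimuth measured in the XY plane.
    """
    dx = operator_pos[0] - tracker_pos[0]
    dy = operator_pos[1] - tracker_pos[1]
    dz = operator_pos[2] - tracker_pos[2]
    azimuth = math.degrees(math.atan2(dy, dx))
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return azimuth, elevation

# Example: tracker at the origin, operator 8 m away holding the reflector at 1.6 m.
print(steer_angles((0.0, 0.0, 0.0), (6.0, 5.0, 1.6)))
```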
Operators can also use their hands to manipulate 3D geometry within the piece coordinate system. They can align 3D holograms to localize the mixed reality device with respect to the piece, and automatically capture mixed reality images that combine reality and holograms to ensure the traceability of manual measurement operations.
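For the hologram-alignment step, one standard way to compute a rigid transform from a handful of corresponding points is the Kabsch/SVD method; the sketch below is a textbook version of that approach, not necessarily what the product uses internally, with illustrative point values.

```python
import numpy as np

def rigid_transform(source, target):
    """Best-fit rotation R and translation t mapping source points onto target points."""
    src, tgt = np.asarray(source, float), np.asarray(target, float)
    src_c, tgt_c = src.mean(axis=0), tgt.mean(axis=0)
    H = (src - src_c).T @ (tgt - tgt_c)        # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Example: four hologram reference points matched to points on the piece
# (a pure translation case, so R should be near identity).
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
R, t = rigid_transform(src, src + [2.0, -1.0, 0.5])
print(R.round(3), t)
```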
Improving Large-Scale Metrology Tasks by an Order of Magnitude
Mixed-reality display technology provides innovative visual tools such as instructions, overlays, and holograms to improve both operator performance and measurement results. Experience visual guidance and feedback right in front of your eyes that ensure you measure correctly each time. See quality improve.