Projector-based Augmented Reality for Telemaintenance Support


Proceedings, 16th IFAC Symposium on Information Control Problems in Manufacturing, Bergamo, Italy, June 11-13, 2018




IFAC PapersOnLine 51-11 (2018) 502–507

Florian Leutert*, Klaus Schilling*

* Zentrum für Telematik e.V., Würzburg, Germany
(e-mail: {florian.leutert, klaus.schilling}@telematik-zentrum.de)

Abstract: For efficient telemaintenance support, teleoperator and service technician need a common spatial awareness: both need to be able to observe the support environment and to point out to each other individual components of machinery where maintenance steps are to be performed. We propose the use of a projector-based Augmented Reality system that enables this kind of remote bi-directional communication with spatial reference. The system requires only a camera and a projector and allows an operator to precisely place virtual annotations in the remote support area using the projector, highlighting critical machine components and making service steps and instructions easily understandable to the technician. The performance of this Augmented Reality support system was evaluated in a large-scale user study with prospective service technicians.

© 2018, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved. Peer review under responsibility of International Federation of Automatic Control. DOI: 10.1016/j.ifacol.2018.08.368

Keywords: Projector-based Augmented Reality, Telemaintenance, computer supported cooperative work.

1. INTRODUCTION

The digitization of manufacturing leads to increasingly complex and inter-connected production machinery. These production systems need to be highly available, thus requiring regular maintenance and requiring service technicians to be able to quickly intervene in case of failures. However, the increasing complexity of those systems means that often only the manufacturer has detailed knowledge of parts, structure and maintenance steps. When facing a new machine to be maintained or trying to overcome a system failure never encountered before, service technicians thus have to consult support staff of the manufacturer for troubleshooting and instructions.

Since time and cost make traveling to the support site prohibitive, the service technician will most often simply call the remote expert, describe the problem via telephone and ask for instructions. While inherently simple, this type of telemaintenance support has apparent limits. Both parties first have to assert the specific machinery component to be maintained, understand the other's description of the problem and need to transfer the description of solution steps back to the surrounding production environment. If technician and expert were to share the same environment, the expert could see the actual production surroundings, verify machinery parts, analyze the problem and provide detailed step-by-step instructions, guiding the technician during the procedure with reference to the local components.

We propose the use of a projector-based Augmented Reality (AR) support system that enables the expert to do just that, but remotely: in addition to a telephone connection, the service technician sets up a portable AR system composed of a camera and a projector. The system quickly initializes, capturing the work environment with a Structured Light scan and transmitting the captured surface information. After this enabling step, the camera transmits live video data to the remote expert, enabling the technician to present his real surroundings and point out critical spots and conditions. The remote expert can then decide on a course of action; to present the solution steps to the technician, he can make additions in the presented camera image, for example highlighting machine components, drawing symbols or adding text annotations. These additions are then transmitted back to the work environment, where the AR-system – using the projector – draws them at exactly the surface positions indicated by the teleoperator in the camera image. The technician thus simply sees the projected annotations at the correct locations in the work environment (Fig. 1). Using this AR-system, the remote expert can point out things to the technician with reference to the actual work environment, showing him step-by-step instructions with reference to the machinery as if he were standing beside him.

Fig. 1. The projector-based AR-support system projecting data onto a robot control cabinet.

If additional information on the inspected machinery is available (for example in the form of annotated CAD data), our proposed AR system can also be used for automatic highlighting of machine components or display of service instructions. In order to do that, the location of the portable AR support system needs to be determined with respect to the CAD-model data (see section 3.4). Afterwards, highlights of components or assembly instructions can be displayed without requiring explicit teleoperator intervention, for example for realizing maintenance training systems or for using the projector-based system as an interactive instruction manual.

The system in its base form can be created from off-the-shelf components, making it very affordable. A prototype of this projector-based AR system was developed and evaluated against other Augmented Reality display techniques in a telemaintenance user study comprising 110 prospective service technicians. This paper first introduces related research, then describes the basic system components and principles as well as possible extensions already realized in our research system, before finally describing the user experiments conducted to evaluate the system's benefits.

2. RELATED RESEARCH

The use of Augmented Reality for support in assembly or maintenance tasks has been investigated extensively in the past – a good overview of this field is given in (Wang 2016).

Several systems use hand-held or head-worn Augmented Reality displays such as tablets or head-up displays to present AR instructions superimposed on the environment to the user ((Webel 2013), (Hořejší 2015), (Henderson 2009)). In these instances, AR is mostly used for training purposes, showing the user step-by-step instructions for assembly or repair tasks, achieving improved learning success or faster localization of manipulation areas. Those systems display fixed instruction sets – which our system is also able to do if data is available (see section 3.4) – but do not include the option for direct teleoperator support. The use of see-through tablets or HUDs for displaying instructions of a teleoperator was examined, for example, by (Aschenbrenner et al. 2016): similar to our system, the teleoperator could draw annotations into images of the scene, which were tracked in the environment using local features and displayed as an annotated image on a portable tablet PC, or as virtual annotations in AR-glasses. While allowing for 3-dimensional virtual annotations, these types of systems require the user to carry around or wear hardware during his work, which may be cumbersome, tiring or limiting to his field of view, leading to acceptance issues.

Projector-based systems display virtual annotations directly onto the work surface, freeing the user's hands to perform maintenance tasks. An exemplary system developed by (Volkswagen AG 2017) employed a stationary projector to make the inside of a car chassis visible. This allows the visualization of hidden parts, structures and pitching lines on the vehicle for training purposes. This system is portable like ours, but relies on optical markers for tracking of the system and is also not designed for use with a teleoperator.

A system closely related to our approach is the Projected AR System for Remote Collaboration by (Tait 2013). Similar to our system, they first capture the environment's surface and use this information to correctly project annotations of a remote expert. Our system improves on their approach in several points. First, we do not need an extra Kinect sensor for 3D measurements of the environment but rather use only a standard camera and the projector itself for measuring the surface with Structured Light. This approach is slower, but also yields a much higher-resolution scan of the environment, allowing for more precise annotations. The system of Tait allows placing user annotations by selecting points in the 3D point cloud model of the environment. We also allow that, but additionally enable the teleoperator to draw directly into the (arbitrarily zoomed) live camera images, which allows for a more natural point of view onto the scene during selection. We can also compensate for holes in the captured 3D point cloud using a CAD-model (if available, see section 3.4) to allow placing annotations outside the initially captured volume. Finally, we performed a large-scale user study with service technicians to prove the practicability of our system in telemaintenance tasks.

3. SYSTEM OVERVIEW

3.1 System hardware

In order to allow transmitting live imagery from the telemaintenance support site, the proposed portable AR system needs to include one (or more) cameras that can stream their data over a network. For realizing the AR annotations in the work environment, a data projector is needed. Both of these devices are mounted on a rigid structure attached to a tripod (Fig. 2). If the support environment is sufficiently large, the system can alternatively be attached directly to the machinery (for example inside a control cabinet), which alleviates some of the (user) occlusion problems all projector-based systems face. Data from and to those devices needs to be streamed to the teleoperation center; to achieve this in our experimental setup, we gather data on a local processing workstation and transmit it to the teleoperator's PC using a client/server TCP architecture, where the support site PC is the server which teleoperators can connect to as clients.

Fig. 2. Hardware of the portable AR-system mounted on a tripod.

Before use, both camera and projector need to be calibrated and their transformation relative to each other determined. This needs to be done only once and can be dealt with when the system is assembled. For calibration, we use the approach described in (Moreno 2012). Afterwards, the tripod with the camera/projector can be freely placed in the operational area and system operation can commence.
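The client/server TCP streaming described in section 3.1 can be illustrated with a minimal length-prefixed frame protocol. This is only a sketch under our own assumptions – the paper does not specify the actual wire format, and the function names are hypothetical:

```python
import socket
import struct
import threading

FRAME_HEADER = struct.Struct(">I")  # 4-byte big-endian payload length

def serve_one_frame(listening_sock: socket.socket, payload: bytes) -> None:
    """Support-site server: accept one teleoperator client, send one frame."""
    conn, _addr = listening_sock.accept()
    with conn:
        conn.sendall(FRAME_HEADER.pack(len(payload)) + payload)

def recv_frame(conn: socket.socket) -> bytes:
    """Teleoperator client: read one length-prefixed frame from the stream."""
    def recv_exact(n: int) -> bytes:
        buf = b""
        while len(buf) < n:
            chunk = conn.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("stream closed early")
            buf += chunk
        return buf
    (length,) = FRAME_HEADER.unpack(recv_exact(FRAME_HEADER.size))
    return recv_exact(length)
```

In a real deployment the server would loop, sending compressed camera frames continuously and receiving annotation commands on a second channel; the length prefix merely delimits messages on the TCP byte stream.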

All annotations can be drawn in arbitrary size and color to ensure they are adequately visible to the user.

3.2 System operation

For starting up the AR support system, the service technician places the tripod at the service site and roughly orients it towards the machinery he requires support with. The system then first performs a Structured Light scan, using several scanning patterns (among them Grey-Codes, Micro- and Modulated-Phase-Shifting) to create a dense surface model of the support area (Fig. 6 bottom left). Using multiple scanning patterns slows down the initialization process, but the redundancy allows immediate filtering of the resulting point cloud model so that it is almost noise-free. The generated surface model is then transferred to the teleoperation center. This whole setup step does not require intervention from the technician and takes approximately 1.5 minutes using a 3.2-megapixel camera and a WUXGA projector.

After this initialization, live camera data is continuously transmitted from the support site. The video resolution can be adapted by the teleoperator to handle different bandwidth scenarios. The teleoperator can use the displayed video stream to analyze the current situation at the remote location and observe things pointed out by the service technician. He thus gains an awareness of the technician's environment and receives spatial cues from him (“this part seems to be damaged”).
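The Grey-Code part of such a scan can be sketched compactly: each projector column is encoded by a sequence of binary stripe patterns, and the sequence of bright/dark values a camera pixel observes decodes back to the projector column that illuminated it. This is a generic illustration of the technique, not the authors' implementation; the function names are hypothetical:

```python
def gray_encode(n: int) -> int:
    """Binary-reflected Gray code of n (adjacent columns differ by one bit)."""
    return n ^ (n >> 1)

def gray_decode(g: int) -> int:
    """Invert the Gray code by cascading XORs."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def column_patterns(width: int, bits: int) -> list:
    """One stripe image per bit; patterns[i][x] is bit (MSB first) of column x."""
    return [[(gray_encode(x) >> b) & 1 for x in range(width)]
            for b in reversed(range(bits))]

def decode_column(observed_bits) -> int:
    """Bits observed at one camera pixel (MSB first) -> projector column."""
    g = 0
    for bit in observed_bits:
        g = (g << 1) | bit
    return gray_decode(g)
```

Gray codes are preferred over plain binary stripes because a one-pixel decoding error at a stripe boundary shifts the result by only one column; the additional phase-shifting patterns mentioned above then refine this coarse correspondence to sub-stripe accuracy.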

Fig. 3. The teleoperator can view the technician during his work and can make annotations in the camera image (top left and right); the technician sees the projected annotations directly in the work environment (bottom).

The teleoperator on the other hand can give spatial cues to the service technician using the projector of the AR support system (see Fig. 3). To do that, he has to select the spot where he wants to place an annotation in his user interface: using his mouse, he can either select a point in the transmitted surface point cloud of the work environment, or simply click on the corresponding spot/pixel in the live camera image. During selection, the user can arbitrarily rotate/zoom into the point cloud as well as zoom into the camera live image to allow a very precise selection of the annotation spot.
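Picking the annotation spot from a click in the live camera image amounts to casting the camera's viewing ray through that pixel and finding the captured surface point it hits. A minimal nearest-point-to-ray sketch, with a hypothetical helper name and a camera assumed at the origin of the point-cloud coordinate frame:

```python
import numpy as np

def pick_surface_point(pixel, K_cam, cloud):
    """Return the cloud point closest to the viewing ray through `pixel`.

    `K_cam` is the camera intrinsic matrix; `cloud` holds the Structured
    Light point cloud in camera coordinates, one 3D point per row.
    """
    # Back-project the clicked pixel to a unit direction in camera coordinates.
    d = np.linalg.inv(K_cam) @ np.array([pixel[0], pixel[1], 1.0])
    d /= np.linalg.norm(d)
    # Perpendicular distance of each point from the ray through the origin.
    along = cloud @ d                     # projection length onto the ray
    perp = cloud - np.outer(along, d)     # perpendicular offset per point
    return cloud[np.argmin(np.linalg.norm(perp, axis=1))]
```

A production system would additionally reject points behind the camera and could interpolate between cloud points for sub-sample accuracy; the dense scan described above keeps the nearest-point error small to begin with.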

Fig. 4. Specifying a text display area by selecting four points in the surface model (left); text is automatically wrapped and limited to this area (right).

The AR support system then uses this input – the annotation to draw, the 3D location where it should appear, and the surface map of the projection environment – to compute the pixel location in the projector image that will make the annotation appear at exactly the indicated 3D location. This computation uses the pre-determined camera and projector calibration data as well as the surface model of the environment to calculate where the projector and camera rays coincide on the projection surface.
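The final step of that computation is a standard pinhole projection: once the 3D annotation point on the surface is known, the projector's calibration (intrinsics K, pose R, t) maps it to the projector pixel that will light it. A sketch with hypothetical calibration values, not the authors' code:

```python
import numpy as np

def project_to_projector(X, K, R, t):
    """Pinhole projection: 3D point X (world frame) -> projector pixel (u, v)."""
    Xc = R @ X + t        # transform into the projector's coordinate frame
    u, v, w = K @ Xc      # apply the intrinsic matrix
    return u / w, v / w   # perspective division

# Hypothetical calibration: 1000 px focal length, WUXGA-centered principal point.
K = np.array([[1000.0,    0.0, 960.0],
              [   0.0, 1000.0, 600.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                    # projector axes aligned with the world frame
t = np.zeros(3)

# Annotation point 2 m in front of the projector, 0.4 m right, 0.2 m up.
u, v = project_to_projector(np.array([0.4, -0.2, 2.0]), K, R, t)
# → (1160.0, 500.0)
```

Because the point lies on the measured surface, drawing the annotation at this projector pixel guarantees it physically appears at the 3D spot the teleoperator selected, regardless of the surface's shape.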

The teleoperator also selects the type of annotation to show. Several options are currently available: symbols such as single points, lines, arrows, circles and rectangles of arbitrary size and color (Fig. 3/5, left); text annotations that can be freely placed by selecting a base line (two points) or even perspectively distorted (by selecting four points to compute the necessary homography) – in the latter case, text is automatically wrapped and limited to fit into the specified area (Fig. 4). Further options are highlight animations that outline a specific spot or machinery part and draw the user's eye towards it; one example is a single point surrounded by a continuously growing and shrinking rectangle (Fig. 5, right) – where the small point alone would be difficult to notice at first glance, the large moving animation allows the local technician to locate the spot quickly. Finally, there is free-hand drawing: the teleoperator can use his mouse to draw freely onto the environment, as he would in a painting program.
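The four-point text placement reduces to estimating a plane-to-plane homography from the four selected corners. A minimal direct-linear-transform sketch (with h33 fixed to 1, an assumption that holds for non-degenerate quads):

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 homography H mapping four src points to four dst
    points (direct linear transform, h33 normalized to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_h(H, p):
    """Warp a 2D point through the homography."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]
```

Each pixel of the axis-aligned rendered text block is then warped through H before being handed to the projector, producing the perspectively distorted placement.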

The teleoperator can see his annotations as a 2D overlay in his user interface (optionally, for immediate feedback while drawing), but also sees them appear (with a slight delay) in the live camera image, allowing him to verify that the information he provides to the technician is displayed correctly. This allows a teleoperator to give the technician precise instructions with spatial cues (“First remove this component by removing the screws here and here…”) by highlighting the individual components these instructions relate to.

The AR support system thus allows bi-directional communication with spatial reference even remotely. Both parties can point out spots or machinery parts to each other, which – combined with voice communication – allows them to work together much as they would if they were actually sharing the same workspace.


Fig. 5. Arrow symbol in the zoomed-in camera image of the telematics user interface, showing the high accuracy of the annotations: 2D overlay (central arrow shape) and projection (visible as halo, left); highlight animation making the annotated bolts immediately visible (right).

3.3 System extensions

The basic AR support system presented here requires neither prior information on (or preparation of) the support site nor any calibration to be performed by the technician; neither optical markers in the environment nor information about the system to be maintained are necessary. Often, however, the teleoperator will have additional information about the system for which telemaintenance was requested. If CAD data of it is available, it can be used to enhance the surface scan of the working environment, allowing system components to be automatically labeled and highlighted as well as automatic repair and maintenance instructions to be displayed with the AR support system.

In order to use the available CAD information, it first needs to be located in projector coordinates – that is, aligned to the Structured Light surface point cloud. To achieve this, our software first creates a point cloud from the given 3D CAD model (Fig. 6) by replacing all surfaces of the model with adequately densely placed points. This point cloud is then evenly sampled and aligned with the Structured Light point cloud via the Iterative Closest Point algorithm (Besl 1992). This algorithm requires an initial estimate of the position of the model to be fit, but is fairly robust – assuming a frontal position (projecting toward the machine to be maintained) usually leads to satisfying convergence of the point clouds and thus an estimate of the CAD model transformation. Once these coordinates are known, the CAD point cloud can be used to fill holes in the surface model of the projection environment and thus enlarge the area of the camera image in which annotations can be made: if no depth data from the surface scan is available to compute the annotation position in the work environment, this data is simply derived from the (transformed) CAD model. Since this model includes all surfaces of the machine to be maintained, depth data is available for all parts of the machine, even where no measurements could be taken before. Depth values directly derived from the initial Structured Light scan are still preferred, since the alignment computed with ICP cannot be assumed to be as precise as direct measurements.

If the CAD data is furthermore annotated, containing for example the names and 3D locations of individual machinery parts, these parts can be highlighted automatically: the user simply selects them from a drop-down list, and their contours are highlighted (Fig. 7) without teleoperator intervention, using their known 3D location from the CAD data. This allows system components to be localized quickly, even if their position is not precisely known to the teleoperator beforehand. If maintenance or repair instructions are associated with the CAD data, automatic step-by-step instructions with spatial reference can be generated: a textual description of each step is displayed, while the components that step refers to are highlighted with the projector (for example, “Loosen the screws of the motor drive block” while highlighting all the screws to be removed). This annotated CAD data of course first has to be provided and prepared – either by the manufacturer of the equipment (who often already produces textual maintenance instructions) or by the provider of the remote telemaintenance service. The assumption that future production equipment will include electronic maintenance and part descriptions does not seem unreasonable.

Besides CAD data, another possible extension of the AR support system is the inclusion of a tracked input device. To make it usable in the AR system, the tracker coordinate system first needs to be determined relative to the support system's coordinates. This can easily be achieved by measuring the coordinates of several points with known locations in the point cloud (for example, the corners of the control cabinet to be maintained). Without the tracking system, the service technician can only give spatial cues to the teleoperator by pointing out spots or items with his hands, visible in the transmitted live camera image. With a tracked input device, the technician can also draw annotations using the projector-based AR system: using the positions provided by the localized tracking system, he can make free-hand drawing annotations directly on the working surface. Tracking data and annotations are transmitted back to the teleoperation center, so the operator can see the technician's annotations in 3D in his surface point cloud as well as in the camera image (Fig. 8), allowing for a possibly more precise spatial reference than pointing with gestures. If combined with CAD data, the tracked input device can also be used to display information about system components – for example, the name of the machine part the technician is currently pointing to, along with additional available information (such as the standard values an electronic component is supposed to have).
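The ICP alignment of the CAD cloud to the scan can be sketched as follows: each iteration matches nearest neighbours and solves the rigid least-squares fit in closed form via SVD (the Kabsch solution). This is an illustrative brute-force version, not the production implementation:

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares R, t such that R @ p + t ~ q for paired rows of P, Q."""
    cP, cQ = P.mean(0), Q.mean(0)
    U, _, Vt = np.linalg.svd((P - cP).T @ (Q - cQ))
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1, 1, d]) @ U.T      # d guards against reflections
    return R, cQ - R @ cP

def icp(P, Q, iters=20):
    """Align cloud P onto Q: brute-force nearest neighbours + rigid fit.
    Needs a rough initial pose, as noted in the text."""
    for _ in range(iters):
        nn = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1).argmin(1)
        R, t = rigid_fit(P, Q[nn])
        P = P @ R.T + t
    return P
```

A real implementation would use a k-d tree for the nearest-neighbour search and subsample both clouds; the closed-form rigid fit is unchanged.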

While these extensions of the AR support system can be helpful in some application scenarios, they are completely optional. The base system does not require any additional hardware or calibration steps to function and can easily be deployed for quick telemaintenance support. This base version of the AR system was also the one tested in the user study.

Fig. 6. The CAD model of the control cabinet is transformed into a point cloud (top left and right); a surface scan (bottom left) is used to align the model via ICP (bottom right, scan data as points, CAD translucent).

Fig. 7. Automatic localization and highlighting of machine components, using annotated CAD data.

4. USER STUDY

In order to evaluate the performance and usefulness of different AR support systems in telemaintenance scenarios, a large user study was performed at a technician school. Prospective technicians in their final year of training were asked to evaluate the systems, performing a regular maintenance task in a robot control cabinet. Their technical background and training made them ideal candidates for evaluation, and a large number of test persons (n = 110) were committed to the tests.

4.1 Test scenario and methodology

The maintenance task to be performed was the substitution of a servo amplifier in an industrial robot control cabinet. To perform this task, the technicians had to remove several mounting screws, connector cables and I/O plugs in a predefined order, replace the servo amplifier with another and re-connect everything in the right order. None of the test persons had done this specific task before, although they were all trained in similar tasks.

The individual steps of how to perform the task had been predetermined – all test persons were given the same instructions. However, the methods by which these step-by-step instructions were presented to them differed: one group (n = 50; groups of 10 persons for each variant) performed the task without a teleoperator, using only local instructions; the other (n = 60) was supported by a remote expert. Local instructions were given as (a) text (printed out), (b) AR-glasses (instructions given as text in the glasses or as pre-recorded audio, while simultaneously highlighting relevant machine components in the AR-glasses) or (c) projector-based AR (this contribution; instructions given as projected text or pre-recorded audio, with simultaneous projected highlighting of relevant machine components).

The second, remotely assisted group had the instructions presented by a teleoperator. Instructions were given verbally by the remote expert leading the user study, with additional highlighting of relevant machine components (where applicable) done manually in the teleoperation interface. The technicians in this case had the opportunity to make further inquiries to the teleoperator if necessary. The different modalities with which instructions were given and/or presented were in this case (see also Fig. 9):

• Voice only (telephone connection)
• Video only (technician filmed the work environment with a tablet computer, no AR annotations)
• Tablet-AR (same as above, but with AR annotations in the live video shown on the tablet (see-through AR))
• AR-glasses (same as Tablet-AR, but with annotations presented in AR-glasses)
• Tablet-screenshot (technician could take a still shot of the work environment into which the expert could make 2D annotations)
• Projector-based AR (the system presented here)

Here too, the user group was divided into 10 persons per variant. Each person performed the task only once (to avoid learning effects). During the experiment, performance was evaluated by measuring the time taken to perform the task and counting the errors made during the maintenance (for example, cables not properly connected). After the experiment, each participant additionally filled out a questionnaire rating the performance of the AR system as well as the corresponding user interface (handling of tablet/glasses etc.). Questionnaires included – among others – SART, NASA TLX, QUESI and ISONORM.
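The per-variant statistics reported for the timing data (mean, sample standard deviation, and standard error with n = 10 per group) can be computed straightforwardly; a sketch with hypothetical sample values:

```python
import numpy as np

def time_stats(times_mmss):
    """Mean, sample std deviation and std error for "mm:ss" task times."""
    secs = np.array([60 * int(m) + int(s)
                     for m, s in (t.split(":") for t in times_mmss)], float)
    mean = secs.mean()
    std = secs.std(ddof=1)                  # sample standard deviation
    stderr = std / np.sqrt(len(secs))       # std error of the mean
    fmt = lambda x: "{:02d}:{:02d}".format(*divmod(int(round(x)), 60))
    return fmt(mean), fmt(std), fmt(stderr)
```

For ten measurements per variant, the standard error is simply the standard deviation divided by √10.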

4.2 Preliminary results

The test results are currently still being evaluated; more detailed results of the study will be published in the future. The users in general found the AR support helpful, but also mentioned some limitations of the different AR systems: for example, handling of the tablet was reported to be cumbersome – users preferred to have their hands free for work. With the AR-glasses system, users reported a limited field of view in which the annotations were visible, as well as handling problems if the user was already wearing regular glasses. The problem most often reported with the projector-based system was occlusion by the user. Despite these limitations, the users strongly preferred the AR systems to the voice-only instructions.

First preliminary results indicate that the projector-based AR support system outperformed (or performed at least equally to) all other variants (with or without AR).

Table 1. Task execution time (in minutes) with the different support variants

Variant        Mean    Std. deviation   Std. error
Voice only     20:46   04:37            01:27
Video only     20:04   03:12            01:00
Screenshot     18:18   02:58            00:56
Tablet-AR      18:19   02:42            00:51
AR-glasses     18:21   03:27            01:05
AR-projector   15:08   03:04            00:58

Table 2. Number of errors during task execution

Variant        Mean   Std. deviation   Std. error
Voice only     0.5    0.83             1.08
Video only     0.4    0.69             0.69
Screenshot     0.5    0.71             0.7
Tablet-AR      0.4    0.84             0.84
AR-glasses     0.8    1.13             1.13
AR-projector   0.1    0.32             0.31

Table 3. Overall workload score (NASA TLX) of task execution

Variant        Mean    Std. deviation   Std. error
Voice only     39.17   11.77            3.72
Video only     36.25   12.09            3.87
Screenshot     31.5    16.56            5.23
Tablet-AR      36.41   12.51            3.96
AR-glasses     34.17   12.9             4.08
AR-projector   25.58   16.85            5.33

Tables 1 to 3 show the results of the second test group regarding task execution times in minutes, the number of errors made during execution and the overall workload score from the NASA TLX test. The projector-based support system outperformed all other variants in these categories (lower values are better). (Pairwise) significance is still being evaluated; however, these results already indicate that the projector-based AR system will be a promising support in telemaintenance scenarios. A more detailed analysis of the user evaluation will follow.

5. CONCLUSION

A projector-based Augmented Reality system for remote support in telemaintenance was presented. The system requires only a camera and a projector, is portable and can be deployed at the support site on demand within minutes. It enables bi-directional communication with spatial reference to the support environment, allowing a teleoperator to guide a service technician efficiently through even complicated service or repair instructions as if he were present on-site. Possible system extensions were considered, and the system was evaluated in a large-scale user study with prospective service technicians. First results showed that the proposed system was received well and reduced service times more than the other support systems. Future research will focus on making the system even more mobile by allowing wireless transmission as well as reducing the size of the local setup (projector and processing computer).

REFERENCES

Aschenbrenner, D., Latoschik, M., Schilling, K. (2016). Industrial maintenance with augmented reality, two case studies. In: Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, pp. 341-342, ACM.
Besl, P. J., McKay, N. D. (1992). A method for registration of 3-D shapes. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 2, pp. 239-256, IEEE.
Henderson, S. J., Feiner, S. (2009). Evaluating the benefits of augmented reality for task localization in maintenance of an armored personnel carrier turret. In: 8th IEEE International Symposium on Mixed and Augmented Reality, pp. 135-144, IEEE.
Hořejší, P. (2015). Augmented reality system for virtual training of parts assembly. Procedia Engineering, 100, pp. 699-706.
Moreno, D., Taubin, G. (2012). Simple, accurate, and robust projector-camera calibration. In: Second International Conference on 3D Imaging, Modeling, Processing, Visualization and Transmission, pp. 464-471, IEEE.
Tait, M. et al. (2013). A projected augmented reality system for remote collaboration. In: IEEE International Symposium on Mixed and Augmented Reality 2013, pp. 1-6, IEEE.
Volkswagen AG (2017). X-ray vision gives Volkswagen a virtual training advantage. Online: http://www.trackingchallenge.de/content/tc/content/en/ObjectTracking/Applications.html.
Wang, X., Ong, S. K., Nee, A. Y. C. (2016). A comprehensive survey of augmented reality assembly research. Advances in Manufacturing, 4(1), pp. 1-22.
Webel, S. et al. (2013). An augmented reality training platform for assembly and maintenance skills. Robotics and Autonomous Systems, 61(4), pp. 398-403.