A Multi-Sensor Interface to Improve the Teaching and Learning Experience in Arc Welding Training Tasks

09/03/2021
by   Hoi-Yin Lee, et al.

This paper presents the development of a multi-sensor extended reality platform to improve the teaching and learning experience of arc welding tasks. Traditional methods to acquire hand-eye welding coordination skills are typically conducted through one-to-one instruction where trainees/trainers must wear protective helmets and conduct several hands-on tests with metal workpieces. This approach is inefficient as the harmful light emitted from the electric arc impedes the close monitoring of the welding process (practitioners can only observe a small bright spot and most geometric information cannot be perceived). To tackle these problems, some recent training approaches have leveraged virtual reality (VR) as a way to safely simulate the process and visualize the geometry of the workpieces. However, the synthetic nature of the virtual simulation reduces the effectiveness of the platform: it lacks actual interaction with the welding environment, which may hinder the learning process of a trainee. To incorporate a real welding experience, in this work we present a new automated multi-sensor extended reality platform for arc welding training. It consists of three components: (1) An HDR camera, monitoring the real welding spot in real-time; (2) A depth sensor, capturing the 3D geometry of the scene; and (3) A head-mounted VR display, visualizing the process safely. Our innovative platform provides trainees with a "bot trainer", virtual cues of the seam geometry, automatic spot tracking, and a performance score. To validate the platform's feasibility, we conduct extensive experiments with several welding training tasks. We show that compared with the traditional training practice and recent virtual reality approaches, our automated method achieves better performance in terms of accuracy, learning curve, and effectiveness.


I Introduction

Arc welding is one of the most common material fusing methods in modern manufacturing [ghosh2017pulse]. In its most basic form, it uses a controllable electric current to melt a joining metal that, once cooled, binds two different metallic parts together [khan2007welding]. Due to its strong and enduring joining properties, welding is used across numerous economically important fields [jeffus2020welding], such as the automotive and aerospace industries, shipbuilding, steel construction, and oil and gas pipelines, to name a few. Despite its widespread use, teaching and learning a proper welding technique has historically presented many challenges to both trainers and trainees [liu2014tutorial].

With the traditional teaching and learning approach, the instructor first explains the principle, shows videos of welding tasks, and performs sample welds as a demonstration [asplund2020lessons]. Trainees are then asked to stand around the welder to observe the process through their helmets and receive one-to-one guidance when necessary, as shown in Fig. 1. However, this approach has several shortcomings. The strong and harmful light emitted during the electrically-induced welding process can seriously injure a welder's eyes and skin [weman2011welding, antonini2003health, althouse2004modern]. Specialized protective equipment such as welding helmets and gloves must always be worn to avoid injuries [althouse2004modern]. Yet, many important details of the welding process are barely observable through the helmet from such a distance. Beginners typically struggle to obtain a clear spatial notion of what is happening on the other side of the protective helmet, and of how the different parameters affect the quality of a weld.

Fig. 1: (a) Arc welding; (b) Illustration of traditional training approaches, where the trainer needs to demonstrate the welding skill to one trainee at a time.

To address the limitations of traditional training approaches, researchers have developed didactic training platforms based on VR and MR systems [chung2020research, huang2020research, jo2010virtual, isham2020mobile, yang2010virtual, chambers2012real, mavrikios2006prototype, fangming2019real, kobayashi2003skill, doolani2020review]. These platforms are typically designed to help beginners familiarize themselves with the procedure and receive guidance on the movement of the welding torch. However, most of these works use ideal virtual environments to simulate the high-energy and dangerous welding process. Hence, no actual interaction with the environment occurs, which produces a synthetic perceptual experience for the users [chung2020research, huang2020research, jo2010virtual, isham2020mobile, yang2010virtual, chambers2012real, mavrikios2006prototype, fangming2019real, kobayashi2003skill]. Training in these types of simulation-based environments may hinder the psychological adaptation that practitioners acquire by conducting welding tasks in the field, such as fear management and thermal sensation.

To address the problems of the current practice and of VR teaching approaches, in this paper we propose an automated extended reality (XR) training assistant platform, see Fig. 2. The proposed system consists of a high dynamic range (HDR) camera to monitor the welding spot in real-time, an RGB-D sensor to capture the 3D geometry of the scene, and a VR headset to visualize the process safely. Images from these sensors are registered in a single image plane. During the training procedure, the RGB-D sensor detects the possible welding region, and an algorithm automatically finds the seam and generates the desired welding path. Visual cues provide the user with valuable extended reality guidance on the conducted weld motion in real-time. A scoring system is also included to quantify the performance of the trainee upon completion of the task. All visual information is displayed on a screen in an XR mode; thus, the welding process can be simultaneously observed by the trainee, the trainer, and other participants. The ultimate goal of this XR system is to help trainees learn the proper movement of the welding torch and gain confidence in real welding tasks. We conducted a series of experiments to evaluate our proposed method in terms of (1) accuracy, (2) learning curve, and (3) effectiveness in teaching and learning.

Compared with existing methods, the developed automated system has the following original contributions:

  • It detects and overlays a desired welding path on the workpiece for trainees to follow.

  • It provides instantaneous motion recommendations to improve the task performance.

  • It quantifies and visualizes the performance of the task.

  • It displays the welding process in real-time to a group of participants simultaneously.

The rest of this paper is organized as follows: Sec. II introduces the architecture of the system. Sec. III presents the experiments. Sec. IV gives the final conclusion.

II Methods

Fig. 2: Our proposed welding training bot system.

II-A System Overview

The XR welding training bot system setup and the experimental environment are presented in Fig. 2. The relationship between the different components involved is shown in Fig. 3. The system includes an RGB-D camera, an HDR camera, a VR headset with a controller, a computer, and a welding torch. The cameras are cross-calibrated and placed at around 10 degrees from the surface normal, facing the welding region. All captured images are sent to a computer for processing. The VR headset is then used to provide real-time results, i.e., a 3D model of the workpiece and a streaming video.

Fig. 3: Data flow of the system.

The workflow depicted in Fig. 3 comprises the following steps: (1) the RGB-D camera takes a photo of the welding area to produce the point cloud data of the workpiece; (2) the seam localization algorithm is adopted to locate the possible welding seam position; (3) the HDR camera captures the welding process and traces the center of the electric arc in real-time; (4) the information is registered into a 2D image and streamed to a webpage; (5) the VR headset accesses the 2D live streaming video and the 3D model via Wi-Fi.

Once the system starts, we obtain color and depth images from the RGB-D camera and continuously receive a grayscale image from the HDR camera. A point cloud is formed from the RGB-D data and a groove detection algorithm is applied to segment the welding groove on the workpiece [zhou2021path, Peng2020APC]. A seam localization algorithm is implemented to automatically determine the welding seam.

During the welding process, the electrode from the welding torch generates an electric arc with the workpiece, which generates a strong light emission. An arc localization algorithm and an XR bot trainer system are applied to trace the welding progress and provide instant advice to the user. Upon completion of the welding task, an evaluation metric is calculated to assess the overall performance. The welding trajectory, average error, and score are presented on-screen to visualize and quantitatively describe the performance of the user.

II-B Seam Localization

A groove is a channel between the edges or the surfaces of two metal workpieces [american2010aws]. The RGB-D camera captures one image covering the whole welding area of the workpiece to locate the groove. Together, the color and depth images form a point cloud. The proposed detection algorithm uses the difference in edge intensity to automatically find the path of the groove: local neighborhoods in the point cloud are computed to segment the groove's approximate location [zhou2021path, Peng2020APC]. In Fig. 4, the groove points are conceptually presented in green over the channel. Computing directly on the bulk of the point cloud would result in a long processing time (many points are involved in finding the seam, which is typically a simple straight line for beginners). An effective solution for locating the seam is therefore implemented: after the groove's segmentation, the seam is computed in a two-dimensional coordinate system to shorten the computational time, with all groove points projected onto a 2D image, as indicated in red in Fig. 5.

Some noisy data may also be included in the groove detection, which degrades the precision of the seam localization. A kernel convolution noise filter is applied to the projected image to remove redundant points. Canny edge detection [ding2001canny] is then applied to sharpen the edge intensity. Inspired by the method in [xiong2019robust], which uses an edge map to extract line segments, we compute a point-line connection edge map to achieve a high-precision result. Points are interconnected by lines to generate an edge map, as depicted in Fig. 5. The length of each line is computed as the Euclidean distance between its two endpoints. As the welding seam is typically longer than the side of the workpiece, short lines representing the side edges are redundant. Therefore, our method only preserves long lines, as shown in Fig. 5.
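
The length-based line filtering can be sketched as follows; the segment format and the minimum length are assumptions for illustration:

```python
import math

def filter_long_lines(segments, min_len):
    """Keep segments at least `min_len` pixels long. Short segments
    typically come from the workpiece's side edges, while the seam
    produces the longest lines. Segment format: ((x1, y1), (x2, y2))."""
    def length(seg):
        (x1, y1), (x2, y2) = seg
        return math.hypot(x2 - x1, y2 - y1)
    return [s for s in segments if length(s) >= min_len]

segments = [((0, 0), (100, 2)),    # long: seam candidate
            ((0, 0), (0, 12)),     # short: side edge
            ((100, 0), (100, 9))]  # short: side edge
seam_candidates = filter_long_lines(segments, min_len=50)
```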

Fig. 4: (a) Two different types of workpieces: Fillet welding and Butt welding workpieces; (b) The groove detection algorithm is applied to locate the possible welding area. Groove points are displayed in green color; (c) Welding seams are indicated in orange.
Fig. 5: (a) 3D coordinates of the groove are projected onto a 2D image in red; (b) The output lines: most of the lines around the seam are the long orange and green ones; blue lines, which represent the side edges of the workpiece, are generally shorter; (c) The blue and green lines in (b) are removed due to their length and slope.

To extract the main characteristic lines, the parameters of the remaining lines are taken into consideration. The average slope is used to narrow down the seam location. Denoting by m_i the slope of the i-th line, the average slope of the N remaining lines is computed as:

m̄ = (1/N) Σ_{i=1}^{N} m_i (1)

Lines whose slope lies within the average slope plus or minus a clearance δ, i.e., within the range [m̄ − δ, m̄ + δ], are kept. With this method, the possible seam region is narrowed down to the orange region, as depicted in Fig. 5.
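
A minimal sketch of the slope-clearance filter, assuming segments given as endpoint pairs and vertical segments excluded beforehand:

```python
def filter_by_average_slope(segments, clearance):
    """Keep segments whose slope lies within the average slope
    plus/minus `clearance`. Vertical segments are dropped first
    for simplicity. Segment format: ((x1, y1), (x2, y2))."""
    segs = [s for s in segments if s[0][0] != s[1][0]]
    slopes = [(y2 - y1) / (x2 - x1) for (x1, y1), (x2, y2) in segs]
    avg = sum(slopes) / len(slopes)
    return [s for s, m in zip(segs, slopes) if abs(m - avg) <= clearance]

segs = [((0, 0), (10, 0)),    # slope 0.0
        ((0, 0), (10, 1)),    # slope 0.1
        ((0, 0), (10, 20))]   # slope 2.0, an outlier
kept = filter_by_average_slope(segs, clearance=0.8)  # drops the outlier
```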

Generally, a line is represented by its two endpoints, and the seam typically passes through two quadrants of the image coordinate plane. Thus, a KD-Tree algorithm [bentley1975multidimensional] is introduced to classify the endpoints into groups according to their quadrants. The two quadrants that contain most of the endpoints indicate the orientation of the seam. The mean value of the endpoints in these two quadrants is used to solve the seam characteristic line. By mapping the depth information of the point cloud onto this line, 3D coordinates can be obtained. The seam line is then divided into a number of segments according to the distance between its endpoints, and in each segment a point is added to indicate the segment location for later guidance.
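
The quadrant grouping of endpoints can be sketched as below. The paper uses a KD-Tree for this classification; a plain dictionary is used here for brevity, and the quadrant convention is an assumption.

```python
from collections import defaultdict

def group_by_quadrant(points):
    """Group endpoints by quadrant of an image-centred coordinate frame
    (Q1: x>=0, y>=0; Q2: x<0, y>=0; Q3: x<0, y<0; Q4: x>=0, y<0)."""
    groups = defaultdict(list)
    for x, y in points:
        q = (1 if x >= 0 else 2) if y >= 0 else (4 if x >= 0 else 3)
        groups[q].append((x, y))
    return groups

def seam_orientation(points):
    """The two most populated quadrants indicate the seam's orientation."""
    groups = group_by_quadrant(points)
    return sorted(groups, key=lambda q: len(groups[q]), reverse=True)[:2]

# Endpoints of a seam running from upper-left to lower-right (Q2 to Q4).
endpoints = [(-5, 1), (-4, 2), (5, -1), (6, -2), (1, 1)]
quadrants = seam_orientation(endpoints)
```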

II-C Electric Arc Localization

Fig. 6: Overview of the electric arc and its center during the welding process.

Once the welding commences, a strong light from the electric arc is generated [weman2011welding]. By computing the center of the arc, as shown in Fig. 6, the intersection point between the center of the electrode and the seam can be located. However, the HDR image is affected by the strong light; only the region around the seam is displayed in real-time, while other areas show the conditions prior to the start of the weld. The regions of the electric arc and the electrode share a high intensity level in the grayscale image. These areas are extracted with a binarization approach to generate a feature image. Nevertheless, some light reflections may also be included, as the binary map is susceptible to noise [xiong2019robust], as depicted in Fig. 7. A "Dimension Removal Algorithm" and an "Electric Arc Tracking Algorithm" are applied to remove the noisy data.

Fig. 7: (a) The welding torch blocks the view; (b) Some light reflections from the surroundings share a similar light intensity level; (c) The noise in (b) is removed; the center of the contour labeled in pink is away from the actual center of the electrode; (d) The correct center can be found with the proposed method; (e) Comparison of the center before and after the use of the method; (f) Demonstration of the verification process.

II-C1 Dimension Removal Algorithm

First, high-intensity regions are represented as series of points forming contours c_j. The area A_j of each contour is computed by iterating over its points, and the contours are filtered as follows:

A_j = (1/2) |Σ_k (x_k y_{k+1} − x_{k+1} y_k)| (2)
d_min < A_j < d_max (3)
C′ = {c_j : d_min < A_j < d_max} (4)

where (x_k, y_k) denote the points of contour c_j, and C′ indicates the contours whose areas pass the condition. The scalars d_min and d_max correspond to the dimensions of a tiny reflection dot and of a large reflection region on the metal, respectively. All contour areas are measured and compared; only contours with an area larger than d_min and smaller than d_max are taken into C′.
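
A sketch of the dimension removal step, assuming contours given as vertex lists and illustrative area thresholds; the shoelace formula stands in for whatever area computation the system actually uses:

```python
def contour_area(points):
    """Shoelace area of a closed contour given as (x, y) vertices."""
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1]
            - points[(i + 1) % n][0] * points[i][1] for i in range(n))
    return abs(s) / 2.0

def dimension_removal(contours, min_area, max_area):
    """Keep contours whose area falls strictly between the thresholds:
    tiny reflection dots fall below min_area, large reflective metal
    regions exceed max_area."""
    return [c for c in contours if min_area < contour_area(c) < max_area]

tiny   = [(0, 0), (1, 0), (1, 1), (0, 1)]           # area 1
medium = [(0, 0), (10, 0), (10, 10), (0, 10)]       # area 100
huge   = [(0, 0), (100, 0), (100, 100), (0, 100)]   # area 10000
kept = dimension_removal([tiny, medium, huge], min_area=10, max_area=1000)
```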

Fig. 8: Overview of the welding training system. Once a workpiece is placed, an RGB-D matrix and an HDR image are obtained. The groove detection and the seam localization algorithms are executed to find the welding path. The electric arc localization is applied to the HDR image to find the center of the arc. The yellow path indicates the welding trajectory, with the seam line shown in blue. The images and the computed results are registered into a 2D frame and uploaded to the server. Electronic devices, such as a VR headset, are used to access the real-time output via a webpage.

II-C2 Electric Arc Tracking

In some cases, part of the electric arc may be blocked by the welding torch, as depicted in Fig. 7. This can shift the center of the contour away from the actual center of the electric arc. A complete circle is therefore used to locate the center precisely: we seek the circle of minimum radius that encloses all the contour points. We first take the midpoint between each pair of contour points as the center and half their distance as the radius of a candidate circle. Then, we validate this circle by confirming that every contour point is bounded by it. By iterating over each possible pair of points, the valid circle with the smallest radius is found.

Nevertheless, three points may also form a valid circle with a smaller radius. By using (5) and iterating over every triple of contour points, the equation of a valid circle can be solved:

(x_i − a)² + (y_i − b)² = r₃²,  i = 1, 2, 3 (5)

where (a, b) and r₃ are the center and the radius of the smallest circle generated from three points. The smaller of the two-point and three-point radii is retained, and it must be below a threshold that reflects the general dimension of an electric arc formed with the different welding methods. The smallest circle enclosing all the contour points is thus found, with its center reflecting the location of the electric arc.
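
The two-point/three-point search for the minimum enclosing circle can be sketched as a brute-force routine over all pairs and triples (O(n^4) including validation, acceptable for the small contours involved; the helper names are ours, not the paper's):

```python
import math
from itertools import combinations

def _circle_2pt(p, q):
    """Circle with the segment pq as diameter."""
    cx, cy = (p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0
    return (cx, cy), math.dist(p, q) / 2.0

def _circle_3pt(a, b, c):
    """Circumcircle of three points (None if they are collinear)."""
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None, float('inf')
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), math.dist((ux, uy), a)

def smallest_enclosing_circle(points, eps=1e-9):
    """Smallest circle that bounds all contour points, found by testing
    every two-point and three-point candidate circle."""
    best_c, best_r = None, float('inf')
    candidates = [_circle_2pt(p, q) for p, q in combinations(points, 2)]
    candidates += [_circle_3pt(a, b, c) for a, b, c in combinations(points, 3)]
    for centre, r in candidates:
        if centre is None or r >= best_r:
            continue
        if all(math.dist(centre, p) <= r + eps for p in points):
            best_c, best_r = centre, r
    return best_c, best_r
```

For real-time use, Welzl's expected-linear-time algorithm would be the usual optimization of this same search.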

II-C3 Electric Arc Verification

The electric arc is determined from the information in the current frame. Two circles of similar size from different contours may appear at the same time, creating ambiguity for the system and producing inaccurate arc locations; a verification check is therefore needed. Since it is unlikely for the welding torch to move a long distance within a short time (e.g., within a single frame), we use this condition as the determinant factor for the verification.

The region corresponding to the electric arc generally yields the largest circle among the candidates, so the algorithm verifies candidates from the largest circle downwards for efficiency. The entire verification process is presented in Algorithm 1. All valid centers and radii found from the contours are sorted by radius in descending order, so the center of the largest circle comes first. Each candidate circle is verified by comparing the distance between its center and the previous arc location. By iterating over the sorted candidates, the largest valid circle near the previous location is found, as depicted in Fig. 7.

Input: filtered contours C′, previous arc centre p_prev
Output: arc centre c*, radius r*
P ← getPoints(C′)
for i ← 1 to |P| do
       c2, r2 ← getCircle2pt(P, i)
       c3, r3 ← getCircle3pt(P, i)
       c_i, r_i ← minCircle(c2, r2, c3, r3)
       filterCircle(c_i, r_i)
      
sortMaxRadius(c, r)
if p_prev is not None then
       for k ← 1 to n do
             d_k ← distanceCompare(c_k, p_prev)
             if d_k < d_thr then
                    return c_k, r_k
                   
             
       
Algorithm 1 Circle Verification.

II-D Welding Direction Determination

The direction of the welding movement mainly varies with the handedness of the welder and the seam location. In a flat welding position with a horizontal seam line, a right-handed welder usually performs the welding from right to left. In a vertical position, welding starts from the bottom, and vice versa. For this reason, a welding direction determination function is applied to resolve the welding direction. Once the welding begins, the system records the movement of the arc center over a number of frames. The starting point is computed from the sensory data captured in the first few frames, and the moving direction is determined by comparing the coordinate difference between the starting point and the current point. For instance, if the starting point lies to the left of the current point after 20 frames, the welder is moving from left to right. With this information, the guidance from the XR bot trainer can be adjusted accordingly to fit the situation.
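
A minimal sketch of the horizontal direction check, assuming image coordinates with x growing to the right and an illustrative displacement threshold:

```python
def welding_direction(start, current, min_disp=5):
    """Compare the arc centre's starting point with its position after a
    number of frames (e.g. 20). Returns None while the horizontal
    displacement is still below `min_disp` pixels."""
    dx = current[0] - start[0]
    if abs(dx) < min_disp:
        return None
    return 'left-to-right' if dx > 0 else 'right-to-left'

# The arc centre moved 40 px to the right over the observed frames.
direction = welding_direction((100, 50), (140, 52))
```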

II-E XR Bot Trainer

The proposed XR bot trainer is responsible for providing instant welding guidance to the user via a head-mounted VR display. Beginners are advised to follow the path and the suggestions provided therein. During the welding process, the suggested coordinate point, generated from the segmentation of the seam localization, is updated by iterating over every segment point, starting from the first. By comparing the coordinates of the arc center and the suggested point, the moving velocity is obtained and the system generates visual cues for the user automatically. If the position and the velocity are close to the suggestion, a circle indicating the target turns green and the user is advised to maintain the motion. If the arc is off the targeted location and moving slower than the suggested velocity, the circle turns red. If it is away from the suggested point and moving too fast, the circle turns blue.
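
The colour-cue logic can be sketched as below; the tolerances are illustrative values, not the system's actual thresholds:

```python
def guidance_colour(pos_error, speed, target_speed,
                    pos_tol=10.0, speed_tol=0.2):
    """Colour of the XR target circle: green when position and speed
    match the suggestion; otherwise blue if moving too fast, red if
    too slow. Errors in pixels, speeds in pixels per frame."""
    on_target = pos_error <= pos_tol
    on_speed = abs(speed - target_speed) <= speed_tol * target_speed
    if on_target and on_speed:
        return 'green'
    return 'blue' if speed > target_speed else 'red'
```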

An instant error and an average error regarding the user's performance are measured automatically by the XR bot trainer and can be displayed in real-time. After the completion of the welding, the system generates a final score and plots the welding trajectory to visualize and evaluate the overall performance of the user. The instant and average errors are calculated as follows:

e_t = ‖p_t − g_t‖ (6)
ē = (1/T) Σ_{t=1}^{T} e_t (7)

where e_t denotes the instant error and ē the average error; p_t is the center of the electric arc, g_t is the suggested coordinate, and T is the total number of points involved from the start to the end of the weld. As the average error reflects the overall performance, it is taken into the calculation of the final score.
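
A sketch of the error and score computation. The instant and average errors follow the description above; the exact score mapping is not given in the paper, so a linear one is assumed here:

```python
import math

def average_error(arc_centres, suggested):
    """Mean Euclidean distance between the tracked arc centre and the
    suggested seam point over the whole weld."""
    errors = [math.dist(p, g) for p, g in zip(arc_centres, suggested)]
    return sum(errors) / len(errors)

def final_score(avg_err, max_err=50.0):
    """Map the average error (pixels) to a 0-100 score; the linear
    mapping and max_err are assumptions for illustration."""
    return max(0.0, 100.0 * (1.0 - avg_err / max_err))

avg = average_error([(0, 0), (3, 4)], [(0, 0), (0, 0)])  # errors 0 and 5
score = final_score(avg)
```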

II-F Accessibility of the System

As the proposed method aims to lower the entry barrier for welding, it supports cross-platform usage. The real-time results and the 3D model of the workpiece are available on a webpage and can be accessed through any electronic device via a wireless connection. The instructor and other participants can simultaneously observe the live welding process of the trainee through a display device. We developed a prototype VR welding headset with which the trainee receives immediate assistance from the XR bot trainer during the welding process. This headset is not only used for accessing the system but also for protecting the user from the radiation of the arc.

III Results

Fig. 9: Experiment Setup of the proposed XR welding training system.
Fig. 10: A variety of workpieces, with blue lines indicating the seam location identified by the system. (a) Fillet welding metal workpieces; (b) Fillet welding plastic workpieces; (c) Butt welding plastic workpieces.

III-A Setup

To validate the proposed method, a setup with two vision sensors is built that allows the whole welding process to be captured. The architecture of the XR system is presented in Fig. 8; it contains an RGB-D camera (Intel Realsense SR305), an HDR camera (New Imaging Technology MC1003), a VR headset with a controller (Oculus Quest 2), a computer, and a standard TIG welding torch. To evaluate the system thoroughly, numerous experiments are carried out. Two trainees and one experienced welding trainer are asked to perform several welding tasks to enable a clear comparison. One of the beginners learns with the traditional (i.e., current practice) approach while the other learns with the multi-sensor system. Both share the same instructor to maintain fairness. The experiments focus on: (1) accuracy of the welding path, (2) accuracy of the arc location, and (3) effectiveness of the system in teaching and learning.

Fig. 11: (a)-(c) Examples of the welding path and the guidance support provided; the orange and white circles indicate the start and end points of the seam. (d) Welding trajectory, errors, and score provided; (e) View in the VR welding headset.

In the experiments, horizontal fillet TIG welding is chosen. A non-consumable electrode is used to generate the electric arc, and the weld and the molten pool are shielded from environmental contamination by an inert shielding gas [weman2011welding]. Obtaining a high-quality weld requires high precision, and since precision significantly affects TIG welding results, the difference between the two methods can be clearly judged by this criterion. Mild steel workpieces of 10-mm thickness with a straight-line welding path are arranged. Although a fillet weld workpiece is prepared, trainees are not required to add any filler until they are capable of doing so.

An example of the setup is shown in Fig. 9. First, the XR bot trainer records the traditional welding process conducted by the instructor. Then, the instructor visually explains the welding techniques to the trainee while replaying the recordings. Afterward, the trainee performs the welding process with the full training system, and the instructor and other trainees can observe the process from the computer simultaneously. Once it is done, the recordings can be replayed to examine minor details of the performance.

Proficiency at a task is positively related to the amount of practice a trainee has had [anzanello2011learning, fioretti2007organizational, newell1981mechanisms]: more practice leads to a lower error rate, and as more trials are conducted, trainees become more proficient. We compute a learning curve to quantify the effectiveness of the system. To this end, the number of trials and the score (which reflects the success of each test) are recorded to construct the learning curve.

Fig. 12: Contours are shown in blue. Red circles indicate the fitted radius and center.

III-B System Analysis

III-B1 Welding Path Generation Analysis

The seam is located from the results of the groove detection. Different groove types may lead to different seam localization results; fillet and butt welding workpieces in various orientations were therefore used to test the robustness of the algorithm. Fig. 10 shows that the seam was correctly located across diverse types and poses. Once the welding begins, the XR bot trainer records the starting and current welding points. The results show that the algorithm determines the welding direction accurately by comparing these two points, see Fig. 11.

III-B2 Electric Arc Localization Analysis

There are uncontrollable light sources around the welding environment. Thus, a venue containing varied lighting was chosen for this experiment to validate the robustness of the developed noise removal function. Light reflections and the electric arc were included in the raw data and bounded by blue contours, as shown in Fig. 12. By adopting the proposed dimension removal and electric arc tracking algorithms, the noise was removed, leaving the desired arc region.

The welding torch was placed at various angles, with part of the arc light blocked intentionally, to test the robustness of the proposed electric arc tracking algorithm. Examples are shown in Fig. 12: where the complete electric arc was not captured by the camera, a circle with the minimum radius was formed. The center of the electric arc was computed and located accurately in various situations, see Fig. 12, proving that the proposed electric arc tracking algorithm is effective in determining the welding point.

Based on the motion of the electric arc and the welding trajectory, appropriate automated suggestions were provided, as presented in Fig. 11. When the trainee welded at a slow pace, the suggested target circle turned red to alert the user. When the velocity was excessively high, leaving insufficient time to melt the material, the circle turned blue. When the welding point and the advised coordinate were similar, the circle turned green. This shows that the bot trainer is capable of producing reasonable and practical XR instructions for the trainee automatically.

III-C Comparison and Effectiveness Testing

As the XR bot trainer was new to the trainee, his welding performance in the first few trials was poor. After several practice sessions, the trainee became familiar with the system and mastered the welding techniques; he was able to add a filler successfully during the weld in the last attempt, see Fig. 13. His welding trajectory was recorded and the results of his performance are plotted in Fig. 14.

Fig. 13: Comparison of the welding performances under the standard learning approach and the proposed method. (a) Standard learning approach: failed to maintain a proper movement speed; (b)-(c) Proposed method: learnt effectively and became capable of including filler in the weld.
Fig. 14: Comparison of the suggested and the actual welding trajectory performed by the user with an instant error. (a), (c) Trajectories and the instant error from Fig. 13(b); (b), (d) Trajectories and the instant error from Fig. 13(c).
Standard Approach Proposed Method
Trial Filler Included? Success? Filler Included? Success?
1-3
4
5
6
7
8
TABLE I: Results of the experiment for each welding trial.
Fig. 15: Scores and average errors of the welding trial done by the trainee with our developed system.

Compared with the standard teaching method, the proposed approach makes it more convenient to explain the welding process and techniques to beginners. The results in Table I show that it helps the trainee achieve good welds efficiently. In the first three trials, both trainees failed to conduct a long welding process. The trainee achieved the first success after five trials with the standard learning approach, and after three trials with the proposed method, see Fig. 13(b). The filler was added successfully at the eighth trial with our XR system, but not with the traditional approach.

Quantified results of the trainee using the developed solution are plotted in Fig. 15. The horizontal axis indicates the number of trials, which reflects the amount of practice, and the left vertical axis indicates the score, which reflects the welding proficiency of the trainee. The results show that the trainee's mistakes reduced over time: the average error decreased with more training. The trainee's performance improved rapidly with the proposed XR bot trainer; the trend of the score in Fig. 15 grows quickly, reflecting a steeper learning curve. The trainee also showed higher confidence and better fear management in performing real welds than when learning from traditional methods. It can be concluded that our proposed method helps improve a trainee's welding proficiency in a short time.

Compared with the VR welding training studies mentioned previously, which simulate the welding act virtually, our XR bot trainer provides immediate support for real-world welding. The motion guidance, the welding trajectory, and the quantified performance evaluation are given with respect to the instantaneous action in the real world. Despite this instant support, our system cannot update the seam location in real-time: as the welding path is solved from point cloud data captured before the welding begins, local geometric changes occurring during the weld are not propagated to the global seam information. Nevertheless, instead of learning and practicing the welding technique in a safe virtual simulated environment, our automated XR training assistant provides a real-time bot teaching and learning platform for the trainer and trainees. With the two vision sensors combined, the welding path and the recommended moving velocity are provided automatically, and the instantaneous welding spot is determined by the electric arc localization. The live performance can be accessed through various types of displays, such as head-mounted devices and monitors, so the instructor can supervise remotely. Since virtual welding only provides a synthetic, perceptually safe experience, it cannot trigger the sense of fear induced by arc welding and thus, unlike our system, cannot help trainees develop proper fear management. The experimental results show that our methodology is effective in teaching and learning.

During the experiments, several faulty results revealed the limitations of the system. First, only thick materials can be considered: in a thin workpiece such as sheet metal, the seam is obscure because the depth difference in the groove is too small for the system to locate the seam properly, and the parts may be misjudged as one large workpiece, see Fig. 16. Second, even when the welding trajectory matches the suggested one, the final weld may still be off the seam, see Fig. 16; this is mainly caused by the limited view of the camera, as the pose of the welding torch cannot be completely captured. Third, the connection quality may affect the user experience: since the system relies heavily on data transmission speed, a stable intranet connection with high bandwidth is required.

In the accompanying multimedia file, we demonstrate the performance of the system with multiple experimental videos.

Fig. 16: Some faulty examples. (a)-(b) Seam between sheet metal workpieces cannot be located as the depth difference is small; (c) Due to the limited angle of view from the system, the result is off from the seam.

IV Conclusion

In this paper, we presented an automated XR training bot assistant for teaching and learning arc welding. It involves a welding torch, an RGB-D camera, an HDR camera, a VR headset, and a computer. With this multi-sensor system, the seam can be accurately located, the instantaneous welding spot of the electric arc is recognized, and immediate automated XR advice is provided accordingly. The score, the errors, and the welding trajectory are displayed on-screen to provide intuitive results. Experiments showed that the proposed method allows beginners to understand and acquire welding techniques effectively. Additionally, compared with the current learning-teaching practice, it gives the instructor a clearer and more convenient way to demonstrate the process.

However, there are a few limitations. First, only a straight seam on a workpiece of at least 5-mm thickness can be located accurately. Second, the angle of the welding torch is difficult to assess due to the limited camera angle, so trainees may easily fail to join two workpieces together without a filler. In the future, more sensing devices could be included to detect the pose of the welding torch and improve the robustness of the XR bot trainer. An automated motorized multi-sensor robot will be built to trace the welding region and the welding spot instantly, providing a better field of view during the process. The reported multi-sensor system can also be used to guide a robotic system [dna2020_front_neuro] in automatic welding applications; we are currently working in this direction.

References