Article

Development of a Virtual Reality-Based System for Simulating Welding Processes

1 Faculty of Mechanical Engineering, University of Transport and Communications, Hanoi 100000, Vietnam
2 Faculty of Electrical-Electronic Engineering, University of Transport and Communications, Hanoi 100000, Vietnam
* Author to whom correspondence should be addressed.
Submission received: 16 April 2023 / Revised: 10 May 2023 / Accepted: 13 May 2023 / Published: 15 May 2023
(This article belongs to the Section Mechanical Engineering)

Abstract

Arc welding processes, such as shielded metal arc welding (SMAW), metal inert gas (MIG), and tungsten inert gas (TIG), play an important role in industrial applications. To improve the efficiency of using traditional welding systems, new technologies have been adopted, and virtual reality is one of them. A virtual reality (VR)-based welding system allows learners to practice more frequently and gain welding experience, helping them avoid errors that occur during actual welding processes. This paper presents a VR-based system for simulating three welding processes: SMAW, MIG, and TIG. The developed system includes hardware components and VR software installed on a computer. Changes in the physical devices, such as movement of the welding torch or the distance from the welding torch to the plates, are updated in real time and reflected in the virtual environment when generating the weld bead. The functionality of the developed system for simulating the welding processes, as in a real welding environment, was tested successfully. For implementing the system, the welding speed and the distance from the welding torch to the plates are important process parameters, which determine the weld size and the weld formation. In this research, the ranges of the welding speed are 70 ÷ 120 mm/min, 240 ÷ 460 mm/min, and 250 ÷ 450 mm/min for the SMAW, TIG, and MIG processes, respectively. These values were verified experimentally. The distance from the welding torch to the plates at which the weld joint is displayed is 1.5 ÷ 5 mm; outside of this range, no weld joint is formed. The weld widths are 4.4 ÷ 12.9 mm, 7.1 ÷ 12.4 mm, and 7.4 ÷ 11.3 mm for the SMAW, TIG, and MIG processes, respectively.

1. Introduction

Welding processes, such as shielded metal arc welding (SMAW), metal inert gas (MIG), and tungsten inert gas (TIG), are widely used in many industrial applications and play an important role in the fabrication and assembly of metal structures. SMAW is a manual arc welding process that uses a consumable electrode to generate the weld joint. MIG welding also uses a consumable electrode, while TIG welding uses a nonconsumable electrode. These welding processes are complex and often require several trials before they can be performed correctly. To simulate the welding processes and fully understand their mechanisms, virtual reality technology was selected [1]. With a virtual reality-based welding simulator, learning welding processes can be made easier and faster [2].
Virtual reality (VR) is an artificial environment or computer-generated virtual environment with the association of hardware to give the impression of a real-world situation to the user [3,4,5]. Currently, VR systems have been applied in many fields, such as manufacturing, business, science, medicine, education, art, public safety, military, and entertainment [6]. A visualization in three dimensions (3D) and the processing of user interactions in real time are mandatory for the VR systems [7,8,9].
Many welding simulation systems have been developed with the goal of helping learners understand welding mechanisms, recognize the errors that lead to poor welding quality, and minimize those errors before practicing on real welding machines.
For simulating the SMAW welding process, Musawel developed a virtual welding system using ultrasound, infrared, optical rotary encoder, linear variable differential transformer (LVDT), and inertial measurement unit (IMU) sensors. The hardware of the VR system consists of a helmet, touch screen, welding gun, coupons, computer, and converters, while the simulation environment constitutes the software [1]. Kobayashi et al. also developed a simulator of the SMAW welding process with a haptic display, in which the virtual electrode providing the haptic sense is realized electromagnetically, and the electrode position is measured by image-based registration [10].
For simulating the MIG welding process, several VR-based MIG systems have been proposed [11,12,13,14,15]. These systems include hardware components such as a helmet, welding torch, workpiece, computer, head-mounted display, a tracking system for both the welding torch and the user's head, and external audio speakers. The tracking model of the welding simulator uses single-camera vision measurement to calculate the positions of the welding torch and helmet. The simulation model uses a simple model method to simulate the weld geometry based on the orientation and speed of the welding torch.
Price et al. introduced the commercial VRTEX 360 system by Lincoln Electric. This system is helpful for training welding skills before carrying out real welding [16]. Rusli et al. introduced a mobile arc welding learning app using mobile technology and enhanced with augmented reality for learning arc welding [17]. Wang et al. proposed a haptic arc welding training method based on VR and haptic guidance. The training system mainly consists of three parts: a phantom desktop haptic device with six degrees of freedom (DOF) of position sensing and three DOF of force feedback, a virtual arc welding model, and a welder training model [18].
The virtual reality environment is a simulation-based method used to help engineers make decisions and establish controls by simulating various engineering activities, such as assembly, production line operations, and machining [19]. Choi et al. applied virtual assembly tools to improve product design [20]. In their study, with the integration of computer-aided design/computer-aided manufacturing (CAD/CAM) and computer-aided engineering (CAE) systems, sophisticated geometric models can be analyzed in a virtual environment. This is essential for analyzing products that are difficult to access, too dangerous, or too expensive to practice on in real life. First, three-dimensional solid models of the components of a grill plate assembly were created using CAD software, and these models were then converted into the standard triangle language (STL) file format for use in a software package called DYNAMO. DYNAMO was used to establish the assembly sequence. The acceptable assembly sequence was then examined for changes to produce an optimum sequence with minimum assembly time. An application of a virtual assembly tool was demonstrated, and the assembly time was reduced by 3 s.
Another virtual reality-based system was developed by Wang et al. using the Unity3D platform [21]. The geographic information is obtained from global positioning system (GPS) measurements. All the geographical entities are then converted to three-dimensional (3D) models using AutoCAD and 3ds Max software. These 3D models are imported into Unity3D and programmed in JavaScript to create game objects and scenes. The scenes are integrated and published on the network. The attribute data of the study area are stored in MySQL, which is connected to the Unity game platform through an external interface.
From analyzing the virtual reality systems mentioned above, we can recognize that implementing a VR-based system requires the following core technologies: real-time 3D computer graphics, wide-angle stereoscopic displays, viewer tracking, hand and gesture tracking, binaural sound, haptic feedback, and voice input/output; the first three are mandatory [22].
For programming VR systems, Vergara et al. reviewed the capabilities of different computational software for designing VR applications, such as Unity3D and Unreal Engine, as well as hardware devices for implementing VR applications [23]. Other tools have also been used for designing VR applications, such as LabVIEW [24], VIRTOOLS [25,26], and the Vuforia tracking engine [27].
Building on the literature review and the results of the research groups presented above, we propose a virtual reality system for simulating three welding processes: the SMAW, MIG, and TIG welding processes.
The developed VR system provides the user with a realistic experience of the actual welding processes, in which the user holds a real welding torch while watching and hearing a virtual weld. By practicing welding operations on the virtual welding system, learners gain welding experience that minimizes errors when carrying out real welding.
The experiences gained by using the developed VR system include:
- Repeating the welding practice many times while varying the welding parameters (the torch travel speed, i.e., the welding speed; the distance from the torch to the plate; the torch angle with the plates; and so on), so that the user develops a feel for the ranges of torch speed and torch angle that allow the weld joint to be established.
- Understanding the mechanisms of the SMAW, MIG, and TIG welding processes and the corresponding welding process parameter values, thereby improving welding knowledge and avoiding welding errors when the user carries out real welding.
The welding errors that users can learn to avoid from the developed VR system during the actual welding process include:
- Failure to form the weld joint because the welding parameters are out of the allowable range for the SMAW, MIG, and TIG welding processes.
- Unsatisfactory weld size: although the welding process parameter values are within the allowable range, the chosen values are unsuitable, for example, the welding speed is too high or too low relative to the wire feed rate.

2. Materials and Methods

2.1. Architecture of the Virtual Reality-Based System

Figure 1 shows the architecture of the virtual reality-based system for simulating welding processes. The system can simulate three welding processes (SMAW, MIG, and TIG) in connection with the physical system. When there is a change in the physical system, such as movement of the welding torch or a change in the distance between the welding torch and the welded plates, the weld formation in the virtual reality environment changes accordingly.
In consideration of the engineering requirements of the VR system, components, and architecture of the VR system, including software and hardware and how to build a VR system [9,28,29,30], we proposed a VR-based system including four modules as follows:
  • Module #1 was responsible for controlling the infrared radiation source and controlling the stepper motor to shorten the welding rod. Module #1 included microcontroller unit #1 (MCU#1), an infrared emitter, and a stepper motor. MCU#1 received control signals from the computer to activate the infrared emitter and stepper motor and sent feedback signals back to the computer. The connection between MCU#1 and the computer was therefore a two-way connection via a recommended standard 485 (RS485) port, a communication standard defining the electrical characteristics of drivers and receivers for use in serial communication systems [31]. MCU#1 used a 5 volt direct current (5V-DC) power source. To emit the infrared signals for the SMAW, MIG, and TIG welding processes, infrared emitters were attached to the welding torches of each welding process. The stepper motor was used to represent the shortening of the welding filler rod in the case of TIG welding, or of the electrode wire or weld stick in the case of the MIG and SMAW welding processes, respectively. For emitting the infrared signals, an 850 nm light-emitting diode (LED) infrared light was utilized.
  • Module #2 was responsible for receiving the infrared radiation and sending the resulting data to the VR module. Module #2 included microcontroller unit #2 (MCU#2) and two circuit boards, which were considered as the two welding plates. Each circuit board was fitted with 32 infrared radiation (IR) sensors, radiation-sensitive optoelectronic components with a spectral sensitivity in the infrared wavelength range of 780 nm ÷ 50 µm [32]. These sensors were responsible for receiving the infrared source from the IR emitter attached to the welding torch. MCU#2 received the analog signals from the IR sensors and processed them into data representing the current position of the welding torch, the distance from the welding torch to the plates, and the welding energy. MCU#2 then sent these data to the computer for describing the weld bead. The connection between MCU#2 and the computer was therefore a one-way connection via the RS485 port. To perform their functions, MCU#2 used the 5V-DC power source, while the two circuit boards used an 8V-DC power source.
  • Module #3 was the VR-based simulation, responsible for simulating the three welding processes (SMAW, MIG, and TIG). To build the virtual reality environment, the real 3D models describing the SMAW, MIG, and TIG welding processes were scanned using a 3D scanner. These point cloud models were then converted to 3D solid models using CAD software (Catia V5 R21). For developing the VR environment, the generation of the 3D models played an important role [33,34]. The filler rod appeared when simulating the TIG welding process; with the MIG and SMAW welding processes, the electrode wire or the weld stick appeared, respectively. Scenes showing welding environments, such as a factory and a laboratory, were also added to the virtual reality environment. These models were input to the virtual reality simulation using the Unity3D platform. The welding-simulation functionality of the VR system was programmed using the C# language on the Unity3D platform. The VR module connected with MCU#1 and MCU#2 via the RS485 port and with the 3D glasses via the universal serial bus (USB) port, an industry standard for communication between a computer and peripheral devices [35].
  • Module #4 was responsible for displaying the welding processes on the 3D glasses. With this module, the user could interact directly with the welding environment in virtual reality using 3D glasses instead of observing the welding process on the computer [36]. The VR module was connected to the 3D glasses via the USB port. The description of the welding process and the welding process information appeared directly in front of the user's eyes through the 3D glasses.
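The per-frame data flow between the four modules can be sketched as follows. This is an illustrative Python stand-in, not the authors' implementation; the class and field names are hypothetical, and the 1.5 ÷ 5 mm distance window is the range stated in the paper.

```python
# Illustrative sketch (not the authors' code) of one update cycle:
# Module #2 reports torch data over RS485; Module #3 (the VR module)
# decides whether the weld bead grows in the virtual scene.
from dataclasses import dataclass

@dataclass
class TorchSample:
    position_mm: float   # torch position along the seam (from Module #2)
    distance_mm: float   # torch-to-plate distance estimated from IR intensity
    energy: float        # relative welding energy from IR signal strength

def simulation_step(sample: TorchSample,
                    min_dist_mm: float = 1.5,
                    max_dist_mm: float = 5.0) -> str:
    """One VR scene update: draw the bead only when the torch-to-plate
    distance lies in the allowed window."""
    if min_dist_mm <= sample.distance_mm <= max_dist_mm:
        return f"draw bead at {sample.position_mm:.1f} mm (energy {sample.energy:.2f})"
    return "no weld joint formed"
```

In the real system this loop runs continuously, so moving the physical torch updates the virtual weld in real time.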

2.2. Systematic Procedure for Developing the Virtual Reality-Based System

The systematic procedure for developing the virtual reality-based welding system to simulate the SMAW, MIG, and TIG welding processes is shown in Figure 2.
In the first step, a module for controlling the infrared radiation source and the stepper motor was developed. This was module #1 of the system with the physical devices, including the MCU#1; IR emitters attached on the SMAW, MIG, and TIG welding torches; stepper motor; and power source converters.
MCU#1 was designed and fabricated to control the infrared radiation source and the stepper motor and to send and receive data to and from a computer via the RS485 port. An algorithm for controlling the IR source and the stepper motor was proposed and implemented. The connections of MCU#1 with the VR module, IR emitters, and stepper motor were established.
In the second step, a module for receiving the IR signals was developed and named module #2. Module #2 consisted of several physical devices: MCU#2, two circuit boards fitted with IR sensors for receiving the IR signals from the IR emitter attached to the welding torch, and power source converters.
MCU#2 was used for receiving the infrared radiation source and sending data to the computer via the RS485 port. MCU#2 and two circuit boards were designed and fabricated. The algorithm for receiving the IR signals and processing these data was proposed and implemented. The connections of MCU#2 with the VR module and two circuit boards were established.
In the third step, the VR module was programmed and connected with modules #1, #2, and #4. The VR module was module #3 of the system. To develop this module, the SMAW, MIG, and TIG welding processes were analyzed to identify the 3D components as well as to determine the input parameters, output parameters, and their allowed values for each welding process. These 3D components were generated by scanning real welding devices with a 3D scanner. The resulting models and scenes were then input to the Unity3D platform for generating the virtual reality environment. Once all the modules were integrated, the functionality of the system was tested and evaluated.

2.3. Development of Module #1

The central control circuit block MCU #1 was responsible for controlling the infrared radiation source and controlling the stepper motor to shorten the welding rod. The architecture of module #1 is shown in Figure 3. MCU #1 received the control signal from the computer via the RS485 port using the Modbus remote terminal unit (RTU) communication protocol block, based on which it output different radiation levels (infrared emission) and welding rod-shortening speeds (electrode wire movement). The architecture of MCU #1 is shown in Figure 4.
For controlling the infrared radiation source, the signal from channel #9 (PWM_M) was used, and for controlling the stepper motor, the signals from channels #6 and #7 were used. Channel #9 was connected to the infrared transmitter attached on the welding torch.
MCU#1 controlled the amount of IR by changing the voltage applied to the 850 nm infrared LED. This LED, serving as the IR emitter, operated between 840 and 850 nm, worked well for generic IR systems, had an emitting angle of 30 degrees, and utilized a DC power source of 1.4–1.6 V at 20 mA.
The dimensions of the LED infrared 850 nm are shown in Appendix A. The function for controlling the IR emitter is detailed in Appendix B.
The principle for MCU#1 controlling the IR levels as shown in Figure 5 is as follows:
MCU#1 provided a variable PWM signal, which passed through a resistor-capacitor (RC) filter, an electric circuit composed of resistors and capacitors, to create a DC voltage for operational amplifier 2 (OPAM_2). The output voltage was fed back through operational amplifier 1 (OPAM_1) to the inverting input of OPAM_2, which performed voltage integration. If the output voltage dropped, so that the voltage at the noninverting input became lower than the input pulse width modulation (PWM) voltage, the OPAM_2 output decreased, causing the reference voltage at the Vref pin of the linear monolithic LM2596 regulator to decrease and its pulse width to increase in compensation. Thus, the output voltage of the LM2596 was always kept stable, corresponding to the fixed PWM from the output of MCU#1.
Channels #20 and #21 of MCU#1 were used for connecting to the computer via the RS485 port. The circuit diagram of the communication block connecting MCU #1 and the computer is shown in Figure 6, where channels RS485_OUTB and RS485_OUTA connect the RS485 port with the computer, and channels RX_485, DK_RS485, and TX_485 connect the RS485 port with MCU #1. The function for communicating with the computer via the RS485 port is detailed in Appendix C.
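To make the Modbus RTU exchange concrete, the sketch below builds a standard "write single register" frame of the kind the computer might send to MCU#1 over RS485. The CRC-16 algorithm is the standard Modbus one; the slave address and the "IR level" register number are hypothetical, not taken from the paper.

```python
# Hedged sketch: constructing a Modbus RTU frame (function code 0x06,
# write single register) as the computer might send to MCU#1 over RS485.
# Slave address and register number below are illustrative assumptions.

def crc16_modbus(data: bytes) -> int:
    """Standard Modbus CRC-16: polynomial 0xA001 (reflected), init 0xFFFF."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def write_single_register(slave: int, register: int, value: int) -> bytes:
    """Build [addr][0x06][reg hi][reg lo][val hi][val lo][crc lo][crc hi]."""
    body = bytes([slave, 0x06]) + register.to_bytes(2, "big") + value.to_bytes(2, "big")
    crc = crc16_modbus(body)
    return body + bytes([crc & 0xFF, crc >> 8])  # CRC appended low byte first

# e.g. set a hypothetical "IR emission level" register 0x0001 on slave 1 to 3
frame = write_single_register(slave=1, register=0x0001, value=3)
```

A handy property of the Modbus CRC is that recomputing it over the full frame (payload plus appended CRC) yields zero, which is how a receiver such as MCU#1 validates an incoming frame.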

2.4. Development of Module #2

For receiving the infrared radiation, circuit block MCU #2 was used (as shown in Figure 7). Two circuit boards were considered as the two material plates of the weld joint. Each circuit board was fitted with 32 infrared radiation sensors, which were IR phototransistors (code: PT15-21B/L14/TR8) with specifications of voltage 30 V, current 0.3 mA, wavelength 940 nm, and power 75 mW.
These sensors (as shown in Appendix D) received the infrared radiation signal from the infrared transmitter attached to the welding torch. MCU #2 processed the signals from the two circuit boards and sent them to the computer via the RS485 port.

2.5. Algorithm for Processing the Signals from IR Sensors

The 32 IR sensors attached to each circuit board were responsible for receiving the signals from the IR transmitter, and the analog signals from these sensors were sent to the computer via MCU#2. The calculation of the signal values for the 32 IR sensors is presented in Appendix E. Figure 8 shows the sensor arrays and their signal values (vi). The distance between two adjacent sensors is 3.001 mm. To obtain the parameters for simulating the welding processes from these signals, an algorithm for processing the sensor signals was proposed, as shown in Figure 9. First, a median filter was applied to the signals to remove noise. The median filter function for the signal values of the 32 IR sensors can be seen in Appendix F. Then, the five largest adjacent signal values were selected. From these values, the current position of the welding torch over the sensor array was determined as follows:
S = Σ_{i=k}^{k+5} (i · v_i) / Σ_{i=k}^{k+5} v_i        (1)
The function to determine the position S was programmed as detailed in Appendix G. From the value of the sensor position (S), the distance from the start point to the current position of the welding torch (x) was calculated. To simulate the welding process, in addition to the x value, this work also considered the dependence of the welding energy on the distance from the nozzle to the plates.
In the real welding process, the distance from the electrode to the welded plates affects the arc energy formation: a large distance yields a small arc energy, and vice versa. Applying this principle to the developed VR system, the distance from the IR transmitter attached to the welding torch to the IR receivers attached to the circuit boards and the angle between the welding torch and the plates affected the weld formation. So, to establish the weld joint in consideration of the arc welding energy, the angle of the torch with the base metal was selected from 70 to 80 degrees (i.e., 10 to 20 degrees from the perpendicular to the base metal), and the distance was from 1.5 to 5 mm. The angle between the filler rod and the base metal was set at 20 degrees [37]. Within these ranges of angle and distance, the receivers could recognize the energy of the signals from the IR transmitters.
Figure 10 depicts the parameters for simulating the welding process: the current position of the welding torch (x), the welding energy (Ew), and the distance from the nozzle to the plates (r). For the simulation, the distance from the nozzle to the plates was assumed to equal 1/5 of the welding energy. The function for calculating the welding torch position, the distance from the welding torch to the plates, and the welding energy is shown in Appendix H.
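The signal-processing chain of this section can be sketched as follows. This is a Python stand-in for the authors' MCU/C# code, not their implementation: the sensor pitch of 3.001 mm and the centroid formula are from the paper, while the median-filter window size and the use of the peak filtered intensity as the relative energy are assumptions.

```python
# Minimal sketch of the IR sensor pipeline: median filter -> strongest
# run of adjacent sensors -> intensity-weighted centroid (Equation (1)).
import statistics

SENSOR_PITCH_MM = 3.001  # distance between adjacent IR sensors (from the paper)

def median_filter(v, k=3):
    """Sliding median over the raw sensor readings to suppress noise.
    Window size k=3 is an assumption."""
    half = k // 2
    return [statistics.median(v[max(0, i - half): i + half + 1])
            for i in range(len(v))]

def torch_position(v):
    """Pick the strongest run of adjacent sensors, then take the
    intensity-weighted centroid S = sum(i*v_i) / sum(v_i)."""
    win = 6  # the paper's sum runs from i = k to k + 5
    k = max(range(len(v) - win + 1), key=lambda i: sum(v[i:i + win]))
    window = v[k:k + win]
    s = sum((k + j) * window[j] for j in range(win)) / sum(window)
    return s * SENSOR_PITCH_MM  # distance x from the start point, in mm

def welding_energy(v):
    """Relative energy Ew taken here as the peak intensity (an assumption);
    the paper then assumes nozzle-to-plate distance r = Ew / 5."""
    ew = max(v)
    return ew, ew / 5.0
```

With a synthetic peak centered between sensors 12 and 13, the centroid lands at index 12.5, i.e., about 37.5 mm from the start of the array.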

2.6. Development of the Virtual Reality Module (Module #3)

To build the virtual reality environment describing the SMAW, MIG, and TIG welding processes, it was important to analyze the mechanisms of these welding processes to obtain the main parts of each process (Figure 11). Based on this analysis, the 3D CAD models of the welding processes were designed using the Catia tool. These models were converted to the STEP or IGES format and then imported into the virtual environment using the Unity3D tool.
Unity3D was used as the virtual reality development platform; it supports several scripting languages, including JavaScript and C#, and works alongside the Windows platform and other programming environments, such as VB.net, VB6, and Delphi [21,38]. In this work, the C# programming language was used for programming the virtual environment. To obtain the models and scenes in the virtual environment, the 3D models were imported into Unity3D, and a program was then written to create the objects and scenes [21]. Modeling plays an important role in a VR system. To generate the 3D models and scenes describing the SMAW, MIG, and TIG welding processes, the parts of each welding component were scanned using a 3D scanner. From the scanned data, which were in point cloud format, the 3D models were generated using Catia software.
For generating the virtual reality environment, an algorithm was proposed, as shown in Figure 12, with two stages: (1) modeling the 3D models, which show the components of the SMAW, MIG, and TIG welding processes, and (2) converting these models to virtual reality modeling language (VRML) models [28,39]. To implement the VR module, this research used the Unity3D platform with the C# language to draw the weld joint and to describe the welding process, as detailed in Appendix L and Appendix K, respectively. Some scenes in Unity3D were employed by adding UnityEngine.dll, as presented in Appendix K.
For connecting module #3 to the other modules, the function for transferring data from modules #1 and #2 to module #3 via the RS485 port is described in detail in Appendix I. The function for module #3 to obtain the data from the other modules is shown in Appendix J.

2.7. Simulation of the Weld Bead Geometry

When simulating the welding process, only the surface of the weld bead is visible, which shows the bead width and the bead reinforcement.
To describe the surface of the weld bead in the virtual environment, the cross-section of the weld bead is approximately parabolic as shown in Figure 13 [11,40].
Z = a·Y^2 + h        (2)
where Y is the position of the welding torch in the Y direction, b represents the bead width, h is the maximum height of the weld bead, and a is a constant. The values of a and h were determined as follows [14]:
a = −4·D^2·f·Δt / b^4        (3)
h = D^2·f·Δt / b^2        (4)
In these equations, D (mm) is the diameter of the electrode wire in the case of the MIG welding process or the diameter of the weld stick or filler rod in the case of the SMAW and TIG welding processes, respectively; f (mm/min) is the wire feed rate; and Δt (min) is the time step.
By substituting Equations (3) and (4) into Equation (2), the relationship between the height of the weld bead (Z), the bead width (b), and the position of the welding torch (Y) in the Y direction can be expressed as Equation (5):
Z = −(4·D^2·f·Δt / b^4)·Y^2 + D^2·f·Δt / b^2        (5)
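The parabolic bead model above translates directly into code. The sketch below is an illustration in Python rather than the authors' implementation; units follow the paper (mm, mm/min, min), and the negative leading coefficient makes the parabola open downward so that Z = h at the bead center and Z = 0 at Y = ±b/2.

```python
# Sketch of the weld bead cross-section model: a downward parabola whose
# peak height follows from the deposited wire volume per time step.

def bead_profile(y_mm, d_mm, f_mm_min, dt_min, b_mm):
    """Height Z of the bead at transverse position Y.
    d: wire/electrode diameter, f: wire feed rate, dt: time step, b: bead width."""
    h = d_mm**2 * f_mm_min * dt_min / b_mm**2          # peak height at Y = 0
    a = -4.0 * d_mm**2 * f_mm_min * dt_min / b_mm**4   # opens downward
    return a * y_mm**2 + h                             # Z = a*Y^2 + h
```

For example, with a 1.2 mm wire fed at 300 mm/min over a 0.01 min step and a 8 mm bead width, the peak height is 1.44 · 3 / 64 = 0.0675 mm, and the profile falls to zero at Y = ±4 mm.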
In consideration of the influence of welding process parameters, the weld bead width for each welding process was determined as follows [41]:
For the SMAW welding process:
b = 9.41 − 0.0072A + 0.064B − 0.00697C + 0.123D + 0.293E        (6)
where A: welding current (A), B: arc length (mm), C: welding speed (mm/min), D: electrode diameter (mm), and E: joint gap (mm).
For the MIG welding process:
b = −6.502 − 0.1p + 0.57969m + 0.5813f + 0.01562d − 0.05375e + 0.00271n        (7)
where p: welding speed (cm/min), m: arc voltage (V), f: wire feed rate (m/min), d: gas flow rate (L/min), e: nozzle-to-plate distance (mm), and n: torch angle (degree).
For the TIG welding process:
b = −0.79 − 0.1468M + 0.01713O + 0.654P + 0.1146Q        (8)
where M: welding speed (cm/min), N: wire feed rate (cm/min), O: % cleaning, P: joint gap (mm), and Q: welding current (A).
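The three regression models translate into straightforward functions; the sketch below is a direct Python transcription of the coefficients and units given in the text, offered as an illustration rather than the authors' code.

```python
# The bead-width regression models transcribed verbatim from the text.
# Argument names encode the units given with each equation.

def bead_width_smaw(current_a, arc_len_mm, speed_mm_min, electrode_d_mm, gap_mm):
    return (9.41 - 0.0072 * current_a + 0.064 * arc_len_mm
            - 0.00697 * speed_mm_min + 0.123 * electrode_d_mm + 0.293 * gap_mm)

def bead_width_mig(speed_cm_min, arc_v, feed_m_min, gas_l_min, nozzle_mm, angle_deg):
    return (-6.502 - 0.1 * speed_cm_min + 0.57969 * arc_v + 0.5813 * feed_m_min
            + 0.01562 * gas_l_min - 0.05375 * nozzle_mm + 0.00271 * angle_deg)

def bead_width_tig(speed_cm_min, cleaning_pct, gap_mm, current_a):
    return (-0.79 - 0.1468 * speed_cm_min + 0.01713 * cleaning_pct
            + 0.654 * gap_mm + 0.1146 * current_a)
```

Note that the welding speed enters all three models with a negative coefficient, so a faster torch produces a narrower bead, matching the qualitative behavior described later in Section 3.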

3. Results and Discussion

3.1. Implementation of the MCU and Circuit Boards

The microcontroller units (MCU #1 and #2) and the two circuit boards were designed and fabricated as shown in Figure 14 and Figure 15, respectively. These devices were then connected to a computer to realize the functionality of the developed system. Figure 15 presents the circuit board with 32 IR sensors attached; the distance between two adjacent sensors equals 3.001 mm.
The specifications of the fabricated MCU were as follows: 32-bit subsystem, operating voltage 1.71–5.5 V, ADC, 48 MHz Arm Cortex-M0 CPU with single-cycle multiply, 44 pins, and RS485 port communication.
The specifications of the fabricated circuit board for reading the IR signal were as follows: operating voltage 7.5–24 V, 32 ADC input channels, Modbus RTU communication over RS485 at a baud rate of 115,200, and one LED for operating status.

3.2. Implementation of the Virtual Reality-Based System

With the fabricated devices and the programmed VR module, these modules and devices were integrated to form the virtual reality-based system for simulating welding processes, as shown in Figure 16. Plates #1 and #2 were the circuit boards to which the IR sensors were attached. These plates could be placed to form different types of weld joints, such as fillet, butt, and lap joints. If the weld type changed, all the programmed modules and the circuit boards remained applicable. In Figure 16, the two plates were placed to form a fillet weld joint. The three welding torches were for the SMAW, TIG, and MIG welding processes. These torches, connected to MCU #1, had IR transmitters attached.
The functionality of the system was as follows:
  • Simulating three welding processes (SMAW, MIG, and TIG) in real time: if there is a change in the physical devices, such as moving the welding torch, the weld joint appears when the position of the welding torch, the distance from the welding torch to the plate, and the arc length are within the allowed value ranges.
  • Describing the mechanisms of the welding processes, the weld joint, and the welding process parameters: The VR system can display the weld size when the user changes the welding speed by moving the welding torch. For example, the weld size increases if the welding speed decreases. The position of the welding torch relative to the plate affects the formation of the weld joint; the weld formation depends not only on the welding speed but also on the distance from the welding torch to the plates and the torch angle with the plates. These parameters affect the welding energy: if the distance is too large, the welding energy is low, and vice versa. So, to display the weld joint, the distance from the welding torch to the plates must be 1.5 ÷ 5 mm, and the angle of the torch with the base metal must be 70 ÷ 80 degrees. The other welding process parameters are shown in Table 1, Table 2 and Table 3.
  • Connecting to other computers on the network and to 3D glasses.
For implementing the VR system, the welding parameters and their values were considered by carrying out experiments for the SMAW, MIG, and TIG welding processes using ARC 200, MIG 200, and TIG 200 welding machines, respectively. The experiments were carried out to determine which welding process parameters affect the weld joint formation. The welding process parameters for the SMAW, MIG, and TIG welding processes are shown in Table 2 and Table 3.
From these experimental data, regression models describing the relationship between the welding process parameters and the weld size (the weld bead width) were established, as shown in Equations (6)–(8). From these models, the main factors among the welding process parameters were determined and taken as the input parameters, shown in Table 1, Table 2 and Table 3, for displaying the weld joint. These mathematical models were used in programming the VR system to show the weld joint formation. When the welding process parameters lie within the range defined by the minimum and maximum values, the weld joint is established; outside of these values, the weld joint does not appear.
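The range check that gates weld joint formation can be sketched directly from the numbers reported in the paper. The function below is an illustration, not the authors' C# code; the welding speed ranges (mm/min), the torch-to-plate distance window, and the torch angle window are the values stated in the text.

```python
# Sketch of the parameter range check the VR module applies before
# drawing a weld joint, using the experimentally determined ranges.

SPEED_RANGE = {"SMAW": (70, 120), "TIG": (240, 460), "MIG": (250, 450)}  # mm/min
DISTANCE_RANGE = (1.5, 5.0)  # torch-to-plate distance, mm
ANGLE_RANGE = (70, 80)       # torch angle with the base metal, degrees

def joint_forms(process, speed_mm_min, distance_mm, angle_deg):
    """True only when all parameters lie in their allowed ranges;
    otherwise no weld joint is displayed."""
    lo, hi = SPEED_RANGE[process]
    return (lo <= speed_mm_min <= hi
            and DISTANCE_RANGE[0] <= distance_mm <= DISTANCE_RANGE[1]
            and ANGLE_RANGE[0] <= angle_deg <= ANGLE_RANGE[1])
```

For instance, an SMAW pass at 100 mm/min with a 3 mm stand-off and a 75 degree torch angle forms a joint, while the same pass at 150 mm/min does not.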

3.3. Simulation Results and Discussions

3.3.1. Simulation Results

The functionality of the developed system was tested successfully, as shown in Figure 17. To simulate the SMAW, MIG, and TIG welding processes, we inputted the value ranges of the welding process parameters. Position changes of the welding torch in the real environment were updated in real time in the virtual environment.
According to the allowed value ranges of the welding process parameters for SMAW, MIG, and TIG, when we moved the real welding torch, the data, including the distance from the welding torch to the plates, the welding torch position, and the welding energy, were sent to the computer. If these data were within the allowed ranges, the weld joint appeared; otherwise, no weld joint was formed.
The developed system describes the welding process correctly and shows the welding process information while the welding is carried out. Weld types such as the fillet joint, butt joint, and lap joint can be simulated by changing the position of the welding sample plates in the hardware, changing the 3D model in the software, and changing the welding process parameters. The simulation results include whether the weld joint was established, the weld size when the joint was established, and the welding process parameters.
Figure 18 shows the results during the TIG welding process. The TIG torch (hardware) was used and shown in the virtual environment (computer and 3D glasses). To carry out the fillet weld joint, two sample welded plates, which were circuit boards attached to the IR sensors, were placed perpendicular to each other, and in the virtual environment, the 3D model of the two plates was placed perpendicularly in the same way. Because the distance from the torch to the plates and the angle between the torch and the plate were within the allowed values, the fillet weld joint was established, comprising the weld line, the weld size, the welding arc light, and the welding sound. The welding process parameters include the welding position (or weld type), plate thickness, welding material, electrode diameter, welding current, torch angle, wire feed rate, inert shielding gas, gas flow rate, welding speed, and arc length.
To make observation easier, the welding process view was also sent to the 3D glasses. With this application, we could watch the weld formation directly in real time as we moved the TIG welding torch, changed the torch angle, and changed the angle between the torch and the plates.

3.3.2. Discussions

The simulation results show that the system correctly describes the welding profile and the weld size. To verify these functionalities, simulations of the fillet weld joint were carried out. Figure 19 shows the dimensions of these plates.
The simulation was carried out with the plate thickness varied as shown in Table 4. In comparison with the required results for the fillet weld joint [45,46], the simulated fillet weld profiles were correct. The difference in weld size between the standard dimensions and the dimensions generated by the developed virtual reality-based system was acceptable, with errors of less than 9% for all experimental cases.
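The error figure above is a simple relative deviation between a standard dimension and a simulated one. A hedged sketch of this comparison follows; the helper name and the sample dimensions are hypothetical, not values taken from Table 4:

```c
#include <math.h>

/* Relative error (%) between a standard weld dimension and the one
 * generated by the VR system; the paper reports errors below 9%. */
double weld_size_error_pct(double standard_mm, double simulated_mm)
{
    return fabs(simulated_mm - standard_mm) / standard_mm * 100.0;
}
```

For example, a hypothetical 6.0 mm standard leg length compared with a simulated 6.4 mm gives an error of about 6.7%, within the 9% bound.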
With the goal of helping the user carry out the real welding process with better-quality joints, i.e., smoother welding with fewer mistakes, less inquiry time, and fewer hints [1,11,47], the functionality of the developed system compared favorably with the reported systems [1,11,47] and with commercial systems such as OcuWeld [36] and VRTEX 360 [16,48]. These systems provide real-time, realistic welding practice that allows the user to interact with the welding components as in the real world. However, the systems differ in functionality as well as in the technologies applied for implementation. Their similarities and differences are shown in Table 5 and are explained as follows:
-
Functionality: Some of the systems reported in the literature simulate only one kind of welding process, such as SMAW (or MMAW—manual metal arc welding) [1,47] or MIG (or GMAW—gas metal arc welding) [11,13,14]. Some commercial systems simulate several welding processes, such as SMAW, MIG, and TIG [16,36,48]. For a VR-based system aimed at education, covering several welding processes is necessary, so we proposed a VR-based system that simulates three welding processes: SMAW, MIG, and TIG.
-
Tracking the welding torch position: This is an important module in the development of a VR system. Devices currently applied include cameras [13,14,16,36,48], Vive trackers [47], sensors [1], or both cameras and sensors [11]. When a camera is used for tracking, the data are processed with an image-processing algorithm; when sensors are used, the analog signals are processed instead. In this research, IR transmitters and IR sensors were used for tracking the welding torch position, and an algorithm for processing the analog signals was proposed.
-
Virtual reality module: For programming the virtual reality environment, platforms and software such as VB.net 2010 [1], the Vive Input Utility, and the SRWorks software development kit [47] have been used. In this research, the Unity3D platform and the C# language were used; PSoC Creator was used for programming the connection of the VR module with the other modules.
-
Visualization in 3D and processing of the user interactions in real time: This is a mandatory requirement of a VR system, so all the VR systems (both the reported systems and the commercial systems) have this functionality.
-
Electrode sticking to the welded plates: This phenomenon happens when the distance from the electrode to the welded plates is too small. Some systems simulate it [11,16,36,48]. Our system will add this functionality in future work.
-
Helmet and 3D glasses: Some systems use a helmet [1,11,13,14,16,47,48], and others use 3D glasses [36]. In this research, we used 3D glasses for watching the welding process directly.
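The sensor-based tracking described above can be sketched as a sliding-window search over the 32 IR readings, a simplified form of the window-sum search used in Appendix G. The window length and the function name here are illustrative; the firmware uses its own constants (e.g., LENGHT_WINDOWS):

```c
#include <stdint.h>

#define N_SENSORS 32  /* the system uses 32 IR sensors */
#define WIN 4         /* window length; illustrative */

/* Return the start index of the WIN-sensor window with the largest
 * summed IR intensity; the torch is located over this window. */
uint8_t window_max_start(const uint16_t v[N_SENSORS])
{
    int32_t sum = 0, best;
    uint8_t best_i = 0;
    for (int i = 0; i < WIN; i++)
        sum += v[i];                       /* sum of the first window */
    best = sum;
    for (int i = WIN; i < N_SENSORS; i++) {
        sum += v[i] - v[i - WIN];          /* slide the window by one */
        if (sum > best) {
            best = sum;
            best_i = (uint8_t)(i - WIN + 1);
        }
    }
    return best_i;
}
```

Running this each sampling cycle keeps the cost linear in the number of sensors, which suits the real-time update requirement.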

4. Conclusions

In this research, a virtual reality-based system for simulating the SMAW, MIG, and TIG welding processes was developed. The developed system describes the welding processes in a virtual environment that closely resembles real welding. The specifications of the virtual reality-based welding system are as follows:
-
The system includes hardware and software. The hardware comprises three welding torches for carrying out the SMAW, MIG, and TIG welding processes, fixtures, welding sample plates attached to circuit boards, a circuit board with a microcontroller unit, 3D glasses, and one computer. The software is a virtual reality module installed on the computer, which connects to the hardware devices to simulate the welding processes.
-
The welding speed ranges from 70 mm/min to 460 mm/min, and the welding joint width from 1.5 mm to 12.9 mm; the selection of these values for simulation depends on the type of welding process. The value ranges of the other welding process parameters are shown in Table 1, Table 2 and Table 3.
-
The system simulates three welding processes with different types of weld joints, such as the fillet joint, butt joint, and lap joint, by changing the position of the welding sample plates attached to the circuit board with the IR sensors.
To implement the VR-based system, hardware components and VR software were designed, fabricated, programmed, and tested. The hardware components comprise two microcontroller units (MCUs), two circuit boards attached to 32 infrared radiation sensors, and the physical parts, such as the welding torches and weld fixtures. The VR software, together with 3D models of the parts of each welding process, such as the welding torch, electrode, filler rod, and base material plates, establishes the virtual environment. The Unity3D tool was used for programming the simulation function of the VR system. To connect the modules of the system, communication protocols such as RS485 and USB were used. Algorithms for controlling and processing the data were proposed, and PSoC Creator was used for programming the control and communication modules. The functionality of the developed system was tested successfully.
The contributions of this research include:
  • Establishing a virtual reality-based system for simulating three welding processes, SMAW, MIG, and TIG, with three-dimensional (3D) visualization and real-time processing of user interactions, in which the weld joint is rendered in the virtual environment when the user moves the welding torch or changes the distance from the welding torch to the plates in the real environment. The weld joint and the welding process parameters are shown on the computer and the 3D glasses.
  • Proposing the algorithms for controlling and processing the data, such as sending and receiving the infrared radiation and converting the analog signals into the parameters describing the weld bead: the position of the welding torch, the distance from the welding torch to the plates, and the welding energy.
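The analog-to-position conversion in the second contribution can be sketched as an intensity-weighted centroid over the active sensor window, mirroring the sumRatioSen/sumSen computation in Appendix H. The scale factor of 10 follows that code; the function name is ours:

```c
#include <stdint.h>

/* Intensity-weighted centroid of the sensor readings inside the active
 * window, as in Appendix H: position = sum(i * v[i]) / sum(v[i]),
 * scaled by 10 so one sensor pitch maps to 10 position units. */
float torch_position(const uint16_t *v, uint8_t start, uint8_t len)
{
    uint32_t weighted = 0, total = 0;
    for (uint8_t i = start; i < start + len; i++) {
        weighted += (uint32_t)i * v[i];
        total    += v[i];
    }
    if (total == 0)
        return 0.0f;                /* no IR signal: torch absent */
    return (float)weighted * 10.0f / (float)total;
}
```

Because the centroid interpolates between sensors, the resolved position is finer than the physical sensor pitch, which is what lets the weld line move smoothly in the virtual environment.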
Studying further functionalities, such as describing the cross-section of the weld bead to evaluate the weld joint, integrating artificial intelligence to predict the quality of the weld joint, and adding further weld materials, is considered future work for this research.

Author Contributions

Conceptualization, N.-H.T.; Data curation, V.-H.B.; Formal analysis, V.-H.B.; Methodology, N.-H.T. and V.-N.N.; Software, V.-N.N.; Validation, N.-H.T. and V.-N.N.; Writing—original draft, N.-H.T. and V.-H.B.; Writing—review and editing, N.-H.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the VIETNAM MINISTRY OF EDUCATION AND TRAINING, grant number: B2021-GHA-01.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Dimensions of the LED infrared 850 nm (unit in mm):
Applsci 13 06082 i003

Appendix B

Function for controlling the IR emitter and stepper motor:
static uint16_t pre_tmp = 0;
void control_execute (uint16_t value)
{
if (pre_tmp != value)
{
pre_tmp = value;
PWM_Control_WriteCompare(pre_tmp);
}
}
void controlLightRun() // emitting infrared radiation
{
if (pre_val_control != valCfg[ADR_REG_HOLDING_RADIO] && val_enable_control == 1 && isConnectedModbus == 1)
{
pre_val_control = valCfg[ADR_REG_HOLDING_RADIO];
control_execute(pre_val_control);
}
}
void controlStepMotorRun() // controlling stepper motor
{
if (valCfg[ADR_REG_HOLDING_ENABLE] == 1 && isConnectedModbus == 1 && valCfg[ADR_REG_HOLDING_PULE]>0)
{
DIR_SERVO_Write (1);
for (int i = 0; i < 10; i++)
{
STEP_SERVO_Write (0);
CyDelay (5);
STEP_SERVO_Write (1);
CyDelay (5);
}
}
}

Appendix C

Function for communicating with the computer via the RS485 port:
void ModbusSlaver_Run(void) // communication
{
if (modbusCompHavedata)
{
if (ModbusSlaver_DecodeMsg (&packet_comp)==SUCCESSFUL)
{
Slaver_Do (&packet_comp);
Slaver_PrepareRespond (&packet_comp);
ModbusSlaver_Comp_SendFrame(packet_comp.buf_resp, packet_comp.len_buf_resp);
}
modbusCompHavedata=0;//clear flag
}
}

Appendix D

Dimensions of the IR phototransistor (unit in mm):
Applsci 13 06082 i004

Appendix E

Calculation of the signal values for the 32 IR sensors:
void analog_cd4052(uint8 sel_channel) // calculating analog signal
{
switch (sel_channel)
{
case 0:
ADC_val_arr_sample [ADC_CH_3] = ADC_IN_GetResult16(0);
ADC_val_arr_sample [ADC_CH_7] = ADC_IN_GetResult16(1);
ADC_val_arr_sample [ADC_CH_11] = ADC_IN_GetResult16(2);
ADC_val_arr_sample [ADC_CH_15] = ADC_IN_GetResult16(3);
ADC_val_arr_sample [ADC_CH_19] = ADC_IN_GetResult16(4);
ADC_val_arr_sample [ADC_CH_23] = ADC_IN_GetResult16(5);
ADC_val_arr_sample [ADC_CH_27] = ADC_IN_GetResult16(6);
ADC_val_arr_sample [ADC_CH_31] = ADC_IN_GetResult16(7);
ANA_0_A_Write (0);
ANA_0_B_Write (0);
ANA_1_A_Write (0);
ANA_1_B_Write (0);
ANA_2_A_Write (0);
ANA_2_B_Write (0);
ANA_3_A_Write (0);
ANA_3_B_Write (0);
break;
case 1:
ADC_val_arr_sample [ADC_CH_0] = ADC_IN_GetResult16(0);
ADC_val_arr_sample [ADC_CH_4] = ADC_IN_GetResult16(1);
ADC_val_arr_sample [ADC_CH_8] = ADC_IN_GetResult16(2);
ADC_val_arr_sample [ADC_CH_12] = ADC_IN_GetResult16(3);
ADC_val_arr_sample [ADC_CH_16] = ADC_IN_GetResult16(4);
ADC_val_arr_sample [ADC_CH_20] = ADC_IN_GetResult16(5);
ADC_val_arr_sample [ADC_CH_24] = ADC_IN_GetResult16(6);
ADC_val_arr_sample [ADC_CH_28] = ADC_IN_GetResult16(7);
ANA_0_A_Write (1);
ANA_0_B_Write (0);
ANA_1_A_Write (1);
ANA_1_B_Write (0);
ANA_2_A_Write (1);
ANA_2_B_Write (0);
ANA_3_A_Write (1);
ANA_3_B_Write (0);
break;
case 2:
ADC_val_arr_sample [ADC_CH_1] = ADC_IN_GetResult16(0);
ADC_val_arr_sample [ADC_CH_5] = ADC_IN_GetResult16(1);
ADC_val_arr_sample [ADC_CH_9] = ADC_IN_GetResult16(2);
ADC_val_arr_sample [ADC_CH_13] = ADC_IN_GetResult16(3);
ADC_val_arr_sample [ADC_CH_17] = ADC_IN_GetResult16(4);
ADC_val_arr_sample [ADC_CH_21] = ADC_IN_GetResult16(5);
ADC_val_arr_sample [ADC_CH_25] = ADC_IN_GetResult16(6);
ADC_val_arr_sample [ADC_CH_29] = ADC_IN_GetResult16(7);
ANA_0_A_Write (0);
ANA_0_B_Write (1);
ANA_1_A_Write (0);
ANA_1_B_Write (1);
ANA_2_A_Write (0);
ANA_2_B_Write (1);
ANA_3_A_Write (0);
ANA_3_B_Write (1);
break;
case 3:
ADC_val_arr_sample [ADC_CH_2] = ADC_IN_GetResult16(0);
ADC_val_arr_sample [ADC_CH_6] = ADC_IN_GetResult16(1);
ADC_val_arr_sample [ADC_CH_10] = ADC_IN_GetResult16(2);
ADC_val_arr_sample [ADC_CH_14] = ADC_IN_GetResult16(3);
ADC_val_arr_sample [ADC_CH_18] = ADC_IN_GetResult16(4);
ADC_val_arr_sample [ADC_CH_22] = ADC_IN_GetResult16(5);
ADC_val_arr_sample [ADC_CH_26] = ADC_IN_GetResult16(6);
ADC_val_arr_sample [ADC_CH_30] = ADC_IN_GetResult16(7);
ANA_0_A_Write (1);
ANA_0_B_Write (1);
ANA_1_A_Write (1);
ANA_1_B_Write (1);
ANA_2_A_Write (1);
ANA_2_B_Write (1);
ANA_3_A_Write (1);
ANA_3_B_Write (1);
break;
default:
break;
};
}

Appendix F

Median filter function from the signal values of the 32 IR sensors:
void ISR_ADC_HANDLE (void)
{
analog_cd4052 (channel);
channel++;
if (channel >= TOTAL_CHAN)
{
channel = 0;
for (uint8_t i = 0; i < ADC_TOTAL; i++)
{
analog_sum[i] += (2047 - (ADC_val_arr_sample[i] >0? ADC_val_arr_sample[i]:0));
}
if (++cnt_sample >= SAMPLE_ADC)
{
cnt_sample=0;
for (uint8_t i = 0; i < ADC_TOTAL; i++)
{
float result = 0;
result= analog_sum[i]/(SAMPLE_ADC*5);
analog_sum[i]=0;
analog_data[i] = (int16_t) result;
}
analog_haveData=1;
}
}
isr_adc_ClearPending();
}

Appendix G

Function for finding the S position:
uint8_t findWindowMaxAnalog (uint16_t *array_source)
{
uint8_t ret_start_position;
int32_t sub_array[SIZE_AUB_ARRAY];
int32_t max_sub_sum;
sub_array[0] = 0;
for (int i = 0; i < LENGHT_WINDOWS; i++)
{
sub_array[0] += array_source[i];
}
max_sub_sum = sub_array[0];
ret_start_position = 0;
for (uint8_t i = LENGHT_WINDOWS; i < ADC_TOTAL; i++)
{
int32_t tmp = sub_array[i-LENGHT_WINDOWS] + array_source[i]- array_source[i-LENGHT_WINDOWS];
sub_array[i-LENGHT_WINDOWS+1] = (tmp >0? tmp :0);
if (max_sub_sum < sub_array[i-LENGHT_WINDOWS+1])
{
max_sub_sum = sub_array[i-LENGHT_WINDOWS+1];
ret_start_position = i-LENGHT_WINDOWS+1;
}
}
for (uint8_t i = LENGHT_WINDOWS; i < ADC_TOTAL; i++)
{
sub_array[i] = 0;
}
return ret_start_position;
}

Appendix H

Function for calculating the welding torch position, distance from the welding torch to plates, and the welding energy:
void SlaverSerial_Update (Packet_Modbus_Slaver * msg)
{
uint16_t arr_tmp_input[ADC_TOTAL];
for (int i = 0; i < ADC_TOTAL; i ++) //=== NUMOFSTORAGE
{
memcpy (arr_tmp_input+i, &analog_data[i], 2);
}
uint8_t start_pos = findWindowMaxAnalog(arr_tmp_input);
switch(msg->funtion)
{
case READ_INPUT_REGISTERS:
input_reg[0] = start_pos;
for (int j = 0; j < LENGHT_WINDOWS; j++)
{
memcpy (input_reg +j+1, arr_tmp_input + start_pos + j, 2);
}
uint32_t sumRatioSen = 0, sumSen = 0;
for (uint8_t i = start_pos; i < start_pos + LENGHT_WINDOWS; i++)
{
sumRatioSen += i*(arr_tmp_input[i]);
sumSen += arr_tmp_input[i];
}
float pos_sen = ((float)sumRatioSen*10.0/sumSen);
input_reg[LENGHT_WINDOWS+1] = pos_sen;
 
if (SLAVER_ADRESS == 2)
{
input_reg[LENGHT_WINDOWS+2] = pos_sen*3.1;
}
else
{
if (pos_sen > 310) pos_sen = 310;
input_reg[LENGHT_WINDOWS+2] = (310 - pos_sen)*3.1;
}
input_reg[LENGHT_WINDOWS+3] = map(sumSen, 0, 2000, 150, 0);
input_reg[LENGHT_WINDOWS+4] = map(sumSen, 0, 2000, 0, 100);
break;
}
}

Appendix I

Function for transferring the data from the device to Unity:
builder.Host.UseWindowsService();
var app = builder.Build();
// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
app.UseSwagger();
app.UseSwaggerUI();
}
app.UseForwardedHeaders (new ForwardedHeadersOptions
{
ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
});
app.UseHttpsRedirection();
app.UseAuthentication();
app.UseRouting();
app.UseCors(option => option
.AllowAnyOrigin()
.AllowAnyMethod()
.AllowAnyHeader());
app.UseAuthorization();
app.UseIotMqttServer();
app.UseExceptionHandler("/myerror");
app.MapControllers();
app.UseEndpoints(endpoints =>
{
endpoints.MapMqtt("/mqtt");
});
app.Run();

Appendix J

Function for obtaining data from devices:
public VirWeldClientData_Model Interpolate (VirWeldDataRaw_Model raw)
{
VirWeldClientData_Model result = new VirWeldClientData_Model();
result.L_Position = raw.Analog1.L_Distance + raw.Analog2.L_Distance;
result.L_Position /= 2;
result.D_Position = raw.Analog1.D_Distance + raw.Analog2.D_Distance;
result.D_Position /= 2;
result.WeldArc = raw.Analog1.weldEnergy + raw.Analog2.weldEnergy;
result.WeldArc /= 2;
result.K_Position = raw.Analog1.K_Distance + raw.Analog2.K_Distance;
result.K_Position /= 2;
result.Device1State = raw.Analog1.ConnectState;
result.Device2State = raw.Analog2.ConnectState;
result.DeviceStepState = raw.StepConnectState;
return result;
}

Appendix K

Program for describing the welding process:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class WeldController : MonoBehaviour
{
public static WeldController Instante;
public GameObject weldTorch,weldJoint;
public Transform weldStick;
public GameObject posStart;
public float smoothl_Position = 0.3f;
public float tiLe = 10f;
public ParticleSystem objWeldArc;
public NightVisionController nightVisionController;
public GameObject audioWeld;
// Start is called before the first frame update
void Start()
{
Instante = this;
temp = weldTorch.transform.position;
max_l_Position = max_l_Position / tiLe;
min_l_Position = min_l_Position / tiLe;
}
public Vector3 target;
public float speed;
public Vector3 temp;
public float temp_l_Position;
public float max_l_Position; // above 5 mm, the weld joint does not appear
public float min_l_Position; // between 1.5 and 5 mm the weld joint appears; below 1.5 mm it does not
public Vector3 posHit;// position of the weld joint
void Update()
{
float step = speed * Time.deltaTime;
weldTorch.transform.position = Vector3.MoveTowards(weldTorch.transform.position, target, step);
CheckForDestination();
 
}
public void SetPosWeldTorch(float l_Position,float d_Position,float k_Position, float weldArc)
{
l_Position = l_Position / tiLe; d_Position = d_Position / tiLe; k_Position = k_Position / tiLe;
target = new Vector3(posStart.transform.position.x + l_Position, posStart.transform.position.y, posStart.transform.position.z);
print (Vector3.Distance(target, temp));
if (Vector3.Distance(target, temp) > (smoothl_Position / tiLe))
{
weldTorch.transform.position = target;
}
temp = weldTorch.transform.position;
var main = objWeldArc.main;
main.startSize = (weldArc / tiLe)*2; //establishing the arc length.
TurnOnOff(d_Position);
}
/// Turn on/ Turn off the weld process
void TurnOnOff(float d_Position)
{
if (temp_l_Position == 0)
{
temp_l_Position = d_Position;
}
if (d_Position > max_l_Position)
{
//Turn off
if (weldJoint.GetComponent<Collider>().enabled)
{
weldJoint.GetComponent<Collider>().enabled = false;
var emission = objWeldArc.emission;
emission.enabled = false;
//nightVisionController.ToggleNightVision(false);
audioWeld.SetActive(false);
}
}
else
{
if (d_Position > min_l_Position)
{
//Turn on
if (!weldJoint.GetComponent<Collider>().enabled)
{
weldJoint.GetComponent<Collider>().enabled = true;
var emission = objWeldArc.emission;
emission.enabled = true;
audioWeld.SetActive(true);
// nightVisionController.ToggleNightVision(true);
}
}
}
}
//Determining the position of the weld arc
void CheckForDestination()
{
Ray ray = new Ray (weldStick.transform.position, weldStick.transform.rotation * Vector3.forward*10);
RaycastHit hit;
if (Physics.Raycast(ray,out hit))
{
Debug.DrawRay(weldStick.transform.position, weldStick.transform.rotation * Vector3.forward* hit.distance, Color.green);
posHit = hit.point;
objWeldArc.transform.position = posHit;
}
else
{
Debug.DrawRay(weldStick.transform.position, weldStick.transform.rotation * Vector3.forward, Color.green);
posHit = Vector3.zero;
Debug.Log("Did not Hit");
}
// bool b = Physics.Raycast(ray, out hit, 1 << 9);
}
}

Appendix L

Program for drawing the weld joint:
using System.Collections.Generic;
using UnityEngine;
public class Mesh04: MonoBehaviour
{
public Vector3 GridSize = new Vector3(3, 3, 3);
public float SurfaceLevel = 0.5f;
public LineRenderer line = null;
public Material material = null;
private GridPoint[,,] p = null;
private List<Vector3> vertices = new List<Vector3>();
private List<int> triangles = new List<int>();
private List<Vector2> uv = new List<Vector2>();
private GridCell cell = new GridCell();
private bool bRequestBuild = false;
public bool showGrid = true;
private void Start()
{
p = new GridPoint[(int) GridSize.x + 1, (int) GridSize.y + 1, (int) GridSize.z + 1];
InitGrid();
}
private void Update()
{
if (Input.GetKeyDown(KeyCode.Space) == true)
{
BuildMesh();
}
if (bRequestBuild == true)
{
BuildMesh();
bRequestBuild = false;
}
}
private void OnEnable()
{
GridPoint.OnPointValueChange += OnPointValueChange;
}
private void OnDisable()
{
GridPoint.OnPointValueChange -= OnPointValueChange;
}
private void InitGrid()
{
//apply noise to points
float nx = 0;
float ny = 0;
float nz = 0;
for (float z = 0; z <= GridSize.z; z++)
{
for (float y = 0; y <= GridSize.y; y++)
{
for (float x = 0; x <= GridSize.x; x++)
{
GameObject go = GameObject.CreatePrimitive(PrimitiveType.Cube);
if(showGrid)
{
go.GetComponent<MeshRenderer>().enabled = true;
}
else
{
go.GetComponent<MeshRenderer>().enabled = false;
}
go.transform.parent = this.transform;
go.GetComponent<Collider>().isTrigger = true;
Rigidbody rb = go.gameObject.AddComponent<Rigidbody>();
rb.useGravity = false;
GridPoint gp = go.gameObject.AddComponent<GridPoint>();
gp.Position = new Vector3(x, y, z);
gp.Size = 0.1f;
gp.Value = SurfaceLevel + 0.05f;
p[(int)x, (int)y, (int)z] = gp;
}
}
}
}
public void DrawLines()
{
float X = GridSize.x;
float Y = GridSize.y;
float Z = GridSize.z;
line.positionCount = 16;
line.SetPosition(0, new Vector3(0, 0, 0)); //C
line.SetPosition(1, new Vector3(0, Y, 0)); //A
line.SetPosition(2, new Vector3(X, Y, 0)); //B
line.SetPosition(3, new Vector3(X, 0, 0)); //D
line.SetPosition(4, new Vector3(0, 0, 0)); //C
line.SetPosition(5, new Vector3(0, 0, Z)); //G
line.SetPosition(6, new Vector3(0, Y, Z)); //E
line.SetPosition(7, new Vector3(X, Y, Z)); //F
line.SetPosition(8, new Vector3(X, 0, Z)); //H
line.SetPosition(9, new Vector3(0, 0, Z)); //G
line.SetPosition(10, new Vector3(0, Y, Z)); //E
line.SetPosition(11, new Vector3(0, Y, 0)); //A
line.SetPosition(12, new Vector3(X, Y, 0)); //B
line.SetPosition(13, new Vector3(X, Y, Z)); //F
line.SetPosition(14, new Vector3(X, 0, Z)); //H
line.SetPosition(15, new Vector3(X, 0, 0)); //D
}
private void BuildMesh()
{
float mark = Time.time;
GameObject go = this.gameObject;
MarchingCube.GetMesh(ref go, ref material, false);
 
/* vertex 8 (0-7)
E4-------------F5 7654-3210
| | HGFE-DCBA
| |
H7-------------G6 |
| | | |
| | | |
| A0--------|----B1
| |
| |
D3-------------C2 */
 
vertices.Clear();
triangles.Clear();
uv.Clear();
for (int z = 0; z < GridSize.z; z++)
{
for (int y = 0; y < GridSize.y; y++)
{
for (int x = 0; x < GridSize.x; x++)
{
cell.p[0] = p[x, y, z + 1]; //A0
cell.p[1] = p[x + 1, y, z + 1]; //B1
cell.p[2] = p[x + 1, y, z]; //C2
cell.p[3] = p[x, y, z]; //D3
cell.p[4] = p[x, y + 1, z + 1]; //E4
cell.p[5] = p[x + 1, y + 1, z + 1]; //F5
cell.p[6] = p[x + 1, y + 1, z]; //G6
cell.p[7] = p[x, y + 1, z]; //H7
MarchingCube.IsoFaces(ref cell, SurfaceLevel);
BuildCellMeshData(ref cell);
}
}
}
Vector3[] av = vertices.ToArray();
int[] at = triangles.ToArray();
Vector2[] au = uv.ToArray();
MarchingCube.SetMesh(ref go, ref av, ref at, ref au);
float timeTaken = Time.time - mark;
if (Mathf.Approximately(timeTaken, 0) == false)
{
Debug.Log(string.Format("mesh time = {0}", timeTaken));
}
}
private void BuildCellMeshData(ref GridCell cell)
{
bool uvAlternate = false;
for (int i = 0; i < cell.numtriangles; i++)
{
vertices.Add(cell.triangle[i].p[0]);
vertices.Add(cell.triangle[i].p[1]);
vertices.Add(cell.triangle[i].p[2]);
triangles.Add(vertices.Count - 1);
triangles.Add(vertices.Count - 2);
triangles.Add(vertices.Count - 3);
if (uvAlternate == true)
{
uv.Add(UVCoord.A);
uv.Add(UVCoord.C);
uv.Add(UVCoord.D);
}
else
{
uv.Add(UVCoord.A);
uv.Add(UVCoord.B);
uv.Add(UVCoord.C);
}
uvAlternate = !uvAlternate;
}
}
private void OnPointValueChange(ref GridPoint gp)
{
print("change");
bRequestBuild = true;
}
}
Applsci 13 06082 i005

References

  1. Musawel, R.K. Design and Construction of Virtual Welding Training System for the Shielded Metal Arc Welding (SMAW). Ph.D. Thesis, University of Basrah, Basra, Iraq, 2013. [Google Scholar]
  2. Yap, H.J.; Taha, Z.; Choo, H.K.; Kok, C.K. Virtual reality-based training system for metal active gas welding. In The Thousand Faces of Virtual Reality; Lanyi, C.S., Ed.; InTech: London, UK, 2014; Chapter 5; pp. 87–104. [Google Scholar]
  3. Fletcher, C.; Ritchie, J.M.; Lim, T. Virtual machining and expert knowledge capture. In Proceedings of the Digital Engagement 2011, Newcastle, UK, 15–17 November 2011. [Google Scholar]
  4. Yunus, F.A.N.; Baser, J.A.; Masran, S.H.; Razali, N.; Rahim, B. Virtual reality simulator developed welding technology skills. J. Mod. Educ. Rev. 2011, 1, 57–62. [Google Scholar]
  5. Gutierrez, M.A.A.; Vexo, F.; Thalmann, D. Stepping into Virtual Reality; Springer: Berlin/Heidelberg, Germany, 2008. [Google Scholar]
  6. Craig, A.B.; Sherman, W.R.; Will, J.D. Developing Virtual Reality Applications; Elsevier: Amsterdam, The Netherlands, 2009. [Google Scholar]
  7. Moraes, R.M.; Machado, L.S. Development of a medical training system with integration of users’ skills assessment. In Virtual Reality; Kim, J.J., Ed.; InTech: London, UK, 2017. [Google Scholar]
  8. Saggio, G.; Ferrari, M. New Trends in Virtual Reality Visualization of 3D Scenarios. In Virtual Reality—Human Computer Interaction; Tang, X.X., Ed.; InTech: London, UK, 2012. [Google Scholar]
  9. Kim, G.J. Designing Virtual Reality Systems; Springer: Berlin/Heidelberg, Germany, 2005. [Google Scholar]
  10. Kobayashi, K.; Ishigame, S.; Kato, H. Simulator of manual metal arc welding with haptic display. In Proceedings of the ICAT, Tokyo, Japan, 5–7 December 2001. [Google Scholar]
  11. Xie, B.; Zhou, Q.; Yu, L. A real-time welding training system base on virtual reality. In Proceedings of the IEEE Virtual Reality Conference, Arles, France, 23–27 March 2015. [Google Scholar]
  12. Reisgen, U.; Schleser, M.; Mokrov, O.; Zabirov, A. Virtual welding equipment for simulation of GMAW processes with integration of power source regulation. Front. Mater. Sci. 2011, 5, 79–89. [Google Scholar] [CrossRef]
  13. White, S.A.; Prachyabrued, M.; Chambers, T.L.; Borst, C.W.; Reiners, D. Low-cost simulated MIG welding for advancement in technical training. Virtual Real. 2011, 15, 69–81. [Google Scholar] [CrossRef]
  14. Chambers, T.L.; Aglawe, A.; Reiners, D.; White, S.; Borst, C.; Prachyabrued, M.; Bajpayee, A. Real-time simulation for a virtual reality based MIG welding training system. Virtual Real. 2012, 16, 45–55. [Google Scholar] [CrossRef]
  15. Fast, K.; Gifford, T.; Yancy, R. Virtual training for welding. In Proceedings of the International Symposium on Mixed and Augmented Reality, Arlington, VA, USA, 5 November 2004. [Google Scholar]
  16. Price, A.H.; Kuttolamadom, M.; Obeidat, S. Using virtual reality welding to improve manufacturing process education. In Proceedings of the 2019 Conference for Industry and Education Collaboration, New Orleans, LA, USA, 30 January–1 February 2019. [Google Scholar]
  17. Rusli, F.N.; Zulkifli, A.; bin Saad, M.; Yussop, Y.M. A study of students’ motivation in using the mobile arc welding learning app. Int. J. Interact. Mob. Technol. 2019, 13, 89–104. [Google Scholar] [CrossRef]
  18. Wang, Y.; Chen, Y.; Nan, Z.; Hu, Y. Study on welder training by means of haptic guidance and virtual reality for arc welding. In Proceedings of the IEEE International Conference on Robotics and Biomimetics, Kunming, China, 17–20 December 2006. [Google Scholar]
  19. Kunwoo, L. Principle of CAD/CAM/CAE Systems; Addison Wesley-Longman: Boston, MA, USA, 1999. [Google Scholar]
  20. Choi, A.C.K.; Chan, D.S.K.; Yuen, A.M.F. Application of virtual assembly tools for improving product design. Int. J. Adv. Manuf. Technol. 2002, 19, 377–383. [Google Scholar] [CrossRef]
  21. Wang, S.; Mao, Z.; Zeng, C.; Gong, H.; Li, S.; Chen, B. A new method of virtual reality based on Unity3D. In Proceedings of the 18th International Conference on Geoinformatics, Beijing, China, 18–20 June 2010; pp. 1–5. [Google Scholar]
  22. Gigante, M.A. Virtual reality: Enabling technologies. In Virtual Reality Systems; Earnshaw, R.A., Gigante, M.A., Jones, H., Eds.; Academic Press: San Diego, CA, USA, 1993; Chapter 2; pp. 15–25. [Google Scholar]
  23. Vergara, D.; Rubio, M.P.; Lorenzo, M. On the design of virtual reality learning environments in engineering. Multimodal Technol. Interact. 2017, 1, 11. [Google Scholar] [CrossRef] [Green Version]
  24. Benson, R.A.; Krishnan, V.L.; Reddy, T.A.; Prasad, G.R.K. Virtual reality-based welding training simulator. Int. J. Control Theory Appl. 2016, 9, 1235–1243. [Google Scholar]
  25. Jo, D.; Kim, Y.; Yang, U.; Choi, J.; Kim, K.H.; Lee, G.A.; Park, Y.W. Welding representation for training under VR environments. In Proceedings of the 10th International Conference on Virtual Reality Continuum and Its Applications in Industry, Hong Kong, China, 11–12 December 2011. [Google Scholar]
  26. Yang, U.; Lee, G.A.; Kim, Y.; Jo, D.; Choi, J.; Kim, K.H. Virtual reality based welding training simulator with 3D multimodal interaction. In Proceedings of the 2010 International Conference on Cyberworlds, Singapore, 20–22 October 2010. [Google Scholar]
  27. Isham, M.I.M.; Mohamed, F.; Haron, H.N.H.; Siang, C.V.; Mokhtar, M.K. Welding training simulation: Combination of virtual reality and multiple marker tracking. In Proceedings of the 2nd National Symposium on Human-Computer Interaction, Virtual, 8 October 2020. [Google Scholar]
  28. Whyte, J. Virtual Reality and the Built Environment; Architectural Press: New York, NY, USA, 2002. [Google Scholar]
  29. Sherman, W.R.; Craig, A.B. Understanding Virtual Reality—Interface, Application, and Design; Elsevier Science: Amsterdam, The Netherlands, 2003. [Google Scholar]
  30. Mihelj, M.; Novak, D.; Begus, S. Virtual Reality Technology and Applications; Springer: Berlin/Heidelberg, Germany, 2014. [Google Scholar]
  31. RS-485 Serial Interface Explained. Available online: https://www.cuidevices.com/blog/rs-485-serial-interface-explained/ (accessed on 6 July 2022).
  32. Infrared Sensor—IR Sensor. Available online: https://www.infratec.eu/sensor-division/service-support/glossary/infrared-sensor/ (accessed on 6 June 2022).
  33. Knopfle, C.; Schiefele, J. Data Preparation on Polygonal Basis. In Virtual Reality for Industrial Applications; Dai, F., Ed.; Springer: Berlin/Heidelberg, Germany, 1997. [Google Scholar]
  34. Hagenmann, F.M. SIMPLIFY—Application Specific Data Preparation. In Virtual Reality for Industrial Applications; Dai, F., Ed.; Springer: Berlin/Heidelberg, Germany, 1997. [Google Scholar]
  35. What is USB (Universal Serial Bus)? Meaning, Types, and Importance. Available online: https://www.spiceworks.com/tech/tech-general/articles/universal-serial-bus/ (accessed on 6 July 2022).
  36. What is OcuWeld? Available online: https://ocuweld.com/ (accessed on 10 September 2022).
  37. Basic Arc Welding Technique. Available online: https://www.en.hongky.com/basic-arc-welding-technique (accessed on 6 June 2022).
  38. Predescu (Burciu), S.L.; Caramihai, S.I.; Moisescu, M.A. Impact of VR Application in an Academic Context. Appl. Sci. 2023, 13, 4748. [Google Scholar] [CrossRef]
  39. Bharath, V.G.; Patil, R. Virtual reality for metal arc welding: A review and design concept. Int. J. Mech. Eng. Technol. 2017, 8, 132–138. [Google Scholar]
  40. Chan, B.K.H. Predicting Weld Features Using Artificial Neural Network Technology. Ph.D. Thesis, Carleton University, Ottawa, ON, Canada, 1996. [Google Scholar]
  41. Tran, N.H.; Bui, V.H.; Hoang, V.T. Development of an artificial intelligence-based system for predicting weld bead geometry. Appl. Sci. 2023, 13, 4232. [Google Scholar] [CrossRef]
  42. Mohd, N.C.W.; Ferry, M.; Nik, W.B.W. A study of software approach for predicting weld bead geometry in shielded metal arc welding (SMAW) process. Appl. Mech. Mater. 2014, 554, 386–390. [Google Scholar]
  43. Ganjigatti, J.P.; Pratihar, D.K.; Choudhury, A.R. Modeling of the MIG welding process using statistical approaches. Int. J. Adv. Manuf. Technol. 2008, 35, 1166–1190. [Google Scholar] [CrossRef]
  44. Kumar, R.; Saurav, S.K. Modeling of TIG welding process by regression analysis and neural network technique. Int. J. Mech. Eng. Technol. 2015, 6, 10–27. [Google Scholar]
  45. Part 5 Welding. Available online: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/896377/MGN629_Part_5_Welding_R07.20.pdf (accessed on 8 August 2022).
  46. Bowman, M.D.; Quinn, B.P. Fillet weld profile measurements. Exp. Tech. 1995, 19, 21–24. [Google Scholar] [CrossRef]
  47. Shankhwar, K.; Chuang, T.J.; Tsai, Y.Y.; Smith, S. A visuo-haptic extended reality–based training system for hands-on manual metal arc welding training. Int. J. Adv. Manuf. Technol. 2022, 121, 249–265. [Google Scholar] [CrossRef]
  48. VRTEX 360+ Dual User Virtual Reality Welding Training Simulator on Pallet. Available online: https://www.lincolnelectric.com/en/products/k4602-1 (accessed on 8 September 2022).
Figure 1. Architecture of the virtual reality-based system for simulating welding processes.
Figure 2. Systematic procedure for developing the virtual reality-based system.
Figure 3. Architecture of module #1.
Figure 4. Circuit diagram of MCU #1.
Figure 5. Principle for controlling the IR levels.
Figure 6. Circuit diagram of communication block—RS485 port.
Figure 7. Architecture of module #2.
Figure 8. Sensor positions and signals.
Figure 9. Algorithm for processing signals from sensors.
Figure 10. Parameters for simulating the welding process.
Figure 11. From 3D CAD models to virtual reality environment.
Figure 12. Algorithm for generating the virtual reality environment.
Figure 13. Cross-section of the weld bead.
Figure 14. Architecture of the circuit block MCU #2.
Figure 15. Architecture of the circuit board attached to IR sensors.
Figure 16. The developed VR-based system.
Figure 17. Testing the functionality of the developed system.
Figure 18. Simulation of the TIG welding process.
Figure 19. Dimensions of the plates for simulating the fillet weld joint.
Table 1. Welding parameters for SMAW process [42].

| Parameters | Units | Notations | Minimum Value | Maximum Value |
| Welding current | A | A | 60 | 90 |
| Arc length | mm | B | 1.2 | 3.0 |
| Welding speed | mm/min | C | 70 | 120 |
| Electrode diameter | mm | D | 2.6 | 4.0 |
| Joint gap | mm | E | 1.0 | 3.0 |
Table 2. Welding parameters for MIG process [43].

| Parameters | Units | Notations | Minimum Value | Maximum Value |
| Welding speed | cm/min | p | 25 | 45 |
| Arc voltage | V | m | 26 | 30 |
| Wire feed rate | m/min | f | 6 | 7 |
| Gas flow rate | l/min | d | 14 | 18 |
| Nozzle to plate distance | mm | e | 15 | 20 |
| Torch angle | degree | n | 70 | 100 |
Table 3. Welding parameters for TIG process [44].

| Parameters | Units | Notations | Minimum Value | Maximum Value |
| Welding speed | cm/min | M | 24 | 46 |
| Wire feed rate | cm/min | N | 1.5 | 2.5 |
| % cleaning | – | O | 30 | 70 |
| Joint gap | mm | P | 2.4 | 3.2 |
| Welding current | A | Q | 80 | 110 |
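The simulator only forms a weld bead when the process parameters fall inside the tabulated windows (e.g., the abstract's torch-to-plate distance of 1.5–5 mm). A minimal sketch of such a range check, with illustrative names not taken from the authors' implementation, could look like this for the SMAW limits of Table 1:

```python
# Hypothetical sketch: validate SMAW settings against the min-max
# windows of Table 1 before rendering a weld bead. Names are
# illustrative, not from the paper's software.

SMAW_LIMITS = {
    "welding_current_A": (60, 90),
    "arc_length_mm": (1.2, 3.0),
    "welding_speed_mm_min": (70, 120),
    "electrode_diameter_mm": (2.6, 4.0),
    "joint_gap_mm": (1.0, 3.0),
}

def within_limits(params: dict, limits: dict) -> bool:
    """True if every supplied parameter lies inside its window."""
    return all(lo <= params[name] <= hi
               for name, (lo, hi) in limits.items())

# A valid SMAW setting versus one whose travel speed is too high.
ok = within_limits({"welding_current_A": 75, "arc_length_mm": 2.0,
                    "welding_speed_mm_min": 100,
                    "electrode_diameter_mm": 3.2,
                    "joint_gap_mm": 2.0}, SMAW_LIMITS)
too_fast = within_limits({"welding_current_A": 75, "arc_length_mm": 2.0,
                          "welding_speed_mm_min": 150,
                          "electrode_diameter_mm": 3.2,
                          "joint_gap_mm": 2.0}, SMAW_LIMITS)
```

The same dictionary-of-windows pattern would apply unchanged to the MIG and TIG limits of Tables 2 and 3.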
Table 4. Comparison of the required results and simulated results.

| Plate Thickness (mm) | Required Weld Size L (mm) [45,46] | Simulated Weld Size L (mm) |
| 3 | 3.5 | 3.8 |
| 4 | 4.0 | 4.3 |
| 5 | 4.5 | 4.8 |
| 6 | 5.0 | 5.2 |
| 8 | 6.0 | 6.4 |
| 10 | 7.0 | 7.3 |
(The original table also shows images of the acceptable fillet weld profiles for both cases.)
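As an observation from Table 4 (not a rule stated in the paper), the required weld sizes follow the linear trend L = 0.5·t + 2.0 mm over the listed thicknesses, and every simulated weld size meets or exceeds its requirement. A short sketch checking both points:

```python
# Data transcribed from Table 4: plate thickness (mm) -> weld size L (mm).
required = {3: 3.5, 4: 4.0, 5: 4.5, 6: 5.0, 8: 6.0, 10: 7.0}   # from [45,46]
simulated = {3: 3.8, 4: 4.3, 5: 4.8, 6: 5.2, 8: 6.4, 10: 7.3}  # this paper

def required_size(t_mm: float) -> float:
    """Linear fit to the tabulated requirements (an observation,
    not an equation given by the cited standards)."""
    return 0.5 * t_mm + 2.0

# The fit reproduces every tabulated requirement, and each simulated
# weld is at least as large as the required size.
fits = all(abs(required_size(t) - L) < 1e-9 for t, L in required.items())
meets = all(simulated[t] >= required[t] for t in required)
```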
Table 5. Comparison of the developed system with the reported systems and the commercial systems.

| | VR for SMAW [1] | VR for MIG [13,14] | VR for Metal Arc Welding [47] | Onew360 [11] | VRTEX 360 [16,48] | OcuWeld [36] | Our Developed System |
| Welding processes that can be simulated | SMAW | MIG | SMAW (or MMAW) | MIG (or GMAW) | SMAW, MIG, TIG | SMAW, MIG, TIG | SMAW, MIG, TIG |
| Tracking the welding torch position | Linear variable differential transformer (LVDT) sensor | Camera | Vive trackers, distance sensor (VL6180X) | Camera, inertial sensors | Camera | Camera | Infrared radiation sensors and LED infrared light |
| Programming the virtual reality module | VB.NET 2010 (Microsoft) | Not mentioned | Vive Input Utility and SRWorks software development kit | Not mentioned | Not mentioned | Not mentioned | PSoC Creator, Unity3D, and C# |
| Visualization in 3D and real-time processing of user interactions | YES | YES | YES | YES | YES | YES | YES |
| Simulates the electrode sticking to the welded plates | Not mentioned | Not mentioned | Not mentioned | YES | YES | YES | NO |
| Helmet or 3D glasses | Helmet | Helmet | Helmet | Helmet | Helmet | 3D glasses | 3D glasses |

Share and Cite

MDPI and ACS Style

Tran, N.-H.; Nguyen, V.-N.; Bui, V.-H. Development of a Virtual Reality-Based System for Simulating Welding Processes. Appl. Sci. 2023, 13, 6082. https://0-doi-org.brum.beds.ac.uk/10.3390/app13106082

