In this study, the researchers developed a method for measuring a tool's position and orientation in real time during robotic welding. The method combines a Robotic Functional System (RFS), joint encoders, and a base point defined in the camera frame.
The RFS unit provides continuous estimates of the tool's position and orientation. These estimates are fused with readings from the encoders, which report the pose of the robotic arm itself. The base point in the camera frame acts as a common reference for the coordinate transformation, allowing the researchers to align the RFS unit's measurements with the camera frame.
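The alignment step above amounts to a standard rigid-body transform. The following is a minimal sketch, not the authors' implementation: the function name, the rotation matrix `R_cam_rfs`, and the use of the base point as the pure translation between frames are all assumptions for illustration.

```python
import numpy as np

def rfs_to_camera(p_rfs, R_cam_rfs, base_cam):
    """Map a point measured in the RFS frame into the camera frame.

    p_rfs     : 3-vector measured by the RFS unit (assumed name).
    R_cam_rfs : 3x3 rotation of the RFS frame expressed in the camera
                frame (hypothetical; would come from calibration).
    base_cam  : position of the shared base point in the camera frame,
                used here as the translation between the two frames.
    """
    return R_cam_rfs @ np.asarray(p_rfs, float) + np.asarray(base_cam, float)

# With an identity rotation and the base point offset by (1, 0, 0),
# a point at (0, 2, 0) in the RFS frame lands at (1, 2, 0) in the camera frame:
p = rfs_to_camera([0.0, 2.0, 0.0], np.eye(3), [1.0, 0.0, 0.0])
# p -> [1.0, 2.0, 0.0]
```

In practice both the rotation and the translation would be estimated once during calibration and then reused for every frame.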
The study uses five image frames captured with a high-speed camera, each recording the positions of distinct sections of the tool together with the base point. By analyzing these frames, the researchers determine the tool's position and orientation in real time, keeping it aligned throughout the welding process.
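Since each frame contains the pixel positions of several tool sections plus the base point, the tool's in-image orientation can be recovered from the tracked points. This sketch assumes two tracked sections (a "tip" and a "tail") and 2-D pixel coordinates; the names and the two-point reduction are illustrative, not the paper's actual procedure.

```python
import numpy as np

def tool_axis(tip_px, tail_px, base_px):
    """Estimate the tool's axis direction from one image frame.

    All inputs are 2-D pixel coordinates. Expressing both tool points
    relative to the base point removes the common image offset before
    the direction is normalised to a unit vector.
    """
    tip = np.asarray(tip_px, float) - np.asarray(base_px, float)
    tail = np.asarray(tail_px, float) - np.asarray(base_px, float)
    d = tip - tail
    return d / np.linalg.norm(d)  # unit vector along the tool in image coords

axis = tool_axis([120, 80], [120, 200], [10, 10])
# axis -> [0, -1]: the tool points straight "up" in image coordinates
```

A full 3-D pose estimate would additionally need the camera intrinsics and more than two tracked points; this sketch only shows the 2-D direction recovery.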
To improve measurement quality, a brief stabilizing pause is introduced after each incremental step; the samples captured during the pause are averaged to suppress noise and improve accuracy. The resulting dataset comprises 15 CSV files of 8,100 entries each, which are then used to train a deep learning model.
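The averaging during each stabilizing pause can be sketched as follows. This is a minimal illustration, assuming each pause yields several pose samples that are collapsed into one CSV row; the array layout and function name are not from the source.

```python
import numpy as np

def average_step(samples):
    """Average the pose samples captured during one stabilizing pause.

    samples : (n, k) array-like, one row per camera/encoder reading and
              one column per pose value. Averaging over the pause
              suppresses sensor noise before the row is written to the
              step's CSV entry.
    """
    return np.asarray(samples, float).mean(axis=0)

# Three noisy readings of the same (x, y) position collapse to one row:
row = average_step([[1.0, 2.1], [1.2, 1.9], [0.8, 2.0]])
# row -> [1.0, 2.0]
```

Repeating this for every incremental step, and for each of the 15 recording sessions, would produce the 15 files of averaged entries described above.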
Overall, the method has the potential to improve robotic welding substantially by providing real-time feedback on tool position and orientation, increasing the accuracy and efficiency of the welding process. In turn, this can raise the quality and speed of robotic welding, supporting more efficient and cost-effective manufacturing.