Because the calibration accuracy of an industrial robot is affected by its pose-set, a simulation data-driven pose-set optimization method is proposed. Firstly, a virtual model of the industrial robot is obtained by calibration and offline compensation, and a data set pairing pose-sets with their calibration accuracies is collected in the simulation space. Secondly,
the features of the data set are extracted, the correlation between each feature factor and the calibration accuracy is computed using the grey relational analysis algorithm, the relevance of the feature factors is verified, and the dimensionality of the data is reduced. Finally,
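The grey relational step can be sketched as follows. This is a minimal illustration of standard (Deng-style) grey relational grades, not the paper's implementation; the function name, the normalization choice, and the distinguishing coefficient rho = 0.5 are assumptions:

```python
import numpy as np

def grey_relational_grades(features, target, rho=0.5):
    """Grey relational grade of each feature column w.r.t. the target series.

    features: (n_samples, n_features) array of extracted feature factors.
    target:   (n_samples,) array, e.g. the measured calibration accuracy.
    rho:      distinguishing coefficient, conventionally 0.5.
    """
    # Min-max normalize each series so different scales are comparable.
    def norm(x):
        return (x - x.min(axis=0)) / (np.ptp(x, axis=0) + 1e-12)

    X = norm(features)
    y = norm(target.reshape(-1, 1))

    # Absolute deviation of each feature series from the reference series.
    diff = np.abs(X - y)                      # (n_samples, n_features)
    dmin, dmax = diff.min(), diff.max()

    # Grey relational coefficients, averaged over samples to give one
    # grade per feature; higher grade = stronger correlation with target.
    coeff = (dmin + rho * dmax) / (diff + rho * dmax)
    return coeff.mean(axis=0)                 # (n_features,)
```

Features whose grade falls below a chosen threshold can then be dropped, which is one plausible way to realize the dimensionality reduction described above.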
the SVR (support vector regression) prediction model is applied to predict the calibration accuracy of candidate pose-sets collected during robot calibration, and the pose-sets with high predicted calibration accuracy are selected. The experimental results demonstrate that the simulation data-driven method improves the average calibration accuracy of the selected pose-sets by 22.9% compared with randomly selected pose-sets.
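The final prediction-and-selection step might look like the sketch below, using scikit-learn's SVR on the retained features. Everything here is illustrative: the synthetic data, the RBF kernel and hyperparameters, and the sign convention that the target is a calibration residual error (lower is better) are all assumptions, not details from the paper:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic stand-in data: each row is a feature vector describing one
# candidate pose-set; the target is its simulated calibration residual.
X = rng.uniform(size=(200, 4))
y = 0.3 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=200)

# Scale features before the RBF-kernel SVR, then fit on the simulation data.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X, y)

# Score new candidate pose-sets and keep those predicted to calibrate best
# (lowest predicted residual error under the assumed convention).
candidates = rng.uniform(size=(50, 4))
pred = model.predict(candidates)
best = candidates[np.argsort(pred)[:5]]
```

In practice the chosen pose-sets would then be used for the physical calibration, which is where the reported accuracy gain over randomly selected pose-sets would be measured.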