US 12,083,619 B2
Method for selection of camera image sections
Eric Kallenbach, Kleinmachnow (DE); Sven Füßler, Kleinmachnow (DE); Sergej Scharich, Kleinmachnow (DE); and Jeroen Jonkers, Kleinmachnow (DE)
Assigned to II-VI DELAWARE, INC., Wilmington, DE (US)
Filed by II-VI Delaware, Inc., Wilmington, DE (US)
Filed on Oct. 29, 2019, as Appl. No. 16/666,827.
Claims priority of application No. 10 2019 101 222.8 (DE), filed on Jan. 17, 2019.
Prior Publication US 2020/0230737 A1, Jul. 23, 2020
Int. Cl. B23K 26/03 (2006.01); B23K 26/044 (2014.01)
CPC B23K 26/032 (2013.01) [B23K 26/034 (2013.01); B23K 26/0344 (2015.10); B23K 26/044 (2015.10)] 20 Claims
OG exemplary drawing
 
1. A method of monitoring laser material processing of at least one workpiece performed by a laser material processing head, the method comprising the steps of:
recording a real-time image using a camera sensor of a camera associated with the laser material processing head, the real-time image comprising a spatial area of the at least one workpiece surrounding a process point used in the laser material processing of the at least one workpiece performed by the laser material processing head;
selecting at least one image section within the real-time image, the at least one image section occupying a location area on the camera sensor of the camera that is smaller than the spatial area of the real-time image, the selecting at least one image section including selection of:
i. a current position of the process point within the real-time image at a current time (T1); and
ii. a desired position for the process point at a future time (T2) based on a projection of programmed path data of the laser material processing head, the desired position having been identified after identification of the process point in the real-time image, and the programmed path data being configured to control the laser material processing head;
calculating a deviation between an actual position of the process point at the future time (T2) and the desired position of the process point at the future time (T2);
predicting, at the current position of the process point at the current time (T1), at least one future image section for selection within the real-time image at a further future time (T3) after the selection of the desired position at the future time (T2) based on the calculated deviation and the programmed path data; and
selecting the at least one future image section within the real-time image at the further future time (T3).
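The claimed steps amount to a deviation-corrected region-of-interest (ROI) prediction: the programmed path gives a desired process-point position at T2, the measured deviation at T2 is added to the programmed position at T3, and the resulting point becomes the center of the next image section. The sketch below illustrates this logic under stated assumptions; all names (`predict_roi`, `programmed_path`, the tuple-based ROI layout) are illustrative and do not appear in the patent, and a real implementation would read the path from the machine controller rather than from a Python callable.

```python
def predict_roi(programmed_path, actual_t2, t2, t3, roi_size, sensor_size):
    """Predict the image section (x, y, w, h) at future time t3 from the
    deviation between the actual and programmed positions at time t2.

    programmed_path: callable t -> (x, y), the programmed head path
    actual_t2:       measured process-point position at time t2
    roi_size:        (width, height) of the image section on the sensor
    sensor_size:     (width, height) of the full camera sensor
    """
    # Desired position at T2, projected from the programmed path data.
    desired_t2 = programmed_path(t2)
    # Deviation between actual and desired position at T2.
    dx = actual_t2[0] - desired_t2[0]
    dy = actual_t2[1] - desired_t2[1]
    # Predicted process-point position at T3: programmed position plus
    # the deviation observed at T2.
    px, py = programmed_path(t3)
    cx, cy = px + dx, py + dy
    # Clamp the section so it stays fully on the camera sensor.
    half_w, half_h = roi_size[0] // 2, roi_size[1] // 2
    cx = min(max(cx, half_w), sensor_size[0] - half_w)
    cy = min(max(cy, half_h), sensor_size[1] - half_h)
    return (cx - half_w, cy - half_h, roi_size[0], roi_size[1])
```

For a straight programmed path `t -> (100*t, 50*t)`, an actual position of (205, 98) at T2 = 2 yields a deviation of (+5, -2); the section predicted for T3 = 3 is then centered on (305, 148) rather than on the programmed (300, 150).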