US 12,449,912 B2
Touchless user-interface control method including fading
Martin Seiler, Ehrenkirchen (DE)
Assigned to AMERIA AG, Heidelberg (DE)
Appl. No. 19/109,611
Filed by AMERIA AG, Heidelberg (DE)
PCT Filed Sep. 14, 2023, PCT No. PCT/EP2023/075332
§ 371(c)(1), (2) Date Mar. 7, 2025,
PCT Pub. No. WO2024/061742, PCT Pub. Date Mar. 28, 2024.
Claims priority of application No. 22196433 (EP), filed on Sep. 19, 2022.
Prior Publication US 2025/0258553 A1, Aug. 14, 2025
Int. Cl. G06F 3/03 (2006.01); G06F 3/01 (2006.01)
CPC G06F 3/0304 (2013.01) [G06F 3/017 (2013.01)] 11 Claims
OG exemplary drawing
 
1. A computer-implemented touchless user-interface control method for an electronic display (100), comprising:
detecting, using at least a first and a second depth camera (110, 120), an input object (200) and determining a set of 3D-points (210) corresponding to the input object (200);
wherein the set of 3D-points (210) includes a first subset of 3D-points (211) which is based on data captured by the first depth camera (110), and a second subset of 3D-points (212) which is based on data captured by the second depth camera (120);
performing 3D-point-fading (113, 123), wherein weights are assigned to the 3D-points, wherein 3D-points of the first subset (211) are weighted depending on their positions relative to the first depth camera (110), and 3D-points of the second subset (212) are weighted depending on their positions relative to the second depth camera (120), wherein the method further comprises:
defining, for the first depth camera (110), a first spatial input area (111) for the recognition of touchless input, the first spatial input area (111) having a first spatial boundary (112);
defining, for the second depth camera (120), a second spatial input area (121) for the recognition of touchless input, the second spatial input area (121) having a second spatial boundary (122);
wherein 3D-point-fading (113, 123) is performed within the respective spatial input area (111, 121);
wherein the method further comprises:
defining, inside each of the spatial input areas (111, 121), a spatial fading layer spatially located adjacent to the respective spatial boundary (112, 122), wherein the spatial fading layers each have a spatial extension;
wherein 3D-point-fading (113, 123) is performed for 3D-points which are positioned within a respective spatial fading layer;
wherein the spatial extension of the spatial fading layers is adjustable.
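The claimed 3D-point-fading can be pictured as a weighting function applied to points near a spatial boundary. The following sketch is purely illustrative and not the patented implementation: it assumes a simple linear fade within the fading layer, and all names (`fade_weight`, `distance_to_boundary`, `layer_extension`) are hypothetical.

```python
def fade_weight(distance_to_boundary: float, layer_extension: float) -> float:
    """Hypothetical fading weight in [0, 1] for a 3D-point.

    distance_to_boundary: the point's distance (in meters) from the
        spatial boundary of the input area, measured inward.
    layer_extension: the adjustable spatial extension of the fading
        layer adjacent to that boundary.
    """
    if distance_to_boundary >= layer_extension:
        return 1.0  # point lies deeper than the fading layer: full weight
    if distance_to_boundary <= 0.0:
        return 0.0  # point on or beyond the boundary: fully faded out
    # Inside the fading layer: weight ramps linearly from 0 to 1.
    return distance_to_boundary / layer_extension

# Example with a fading layer extension of 5 cm:
weights = [fade_weight(d, 0.05) for d in (0.10, 0.025, 0.0)]
# -> [1.0, 0.5, 0.0]
```

In a two-camera setup as claimed, such a weight would be computed per point and per camera (each subset of 3D-points evaluated against its own camera's input area and boundary), and the adjustable `layer_extension` corresponds to the adjustable spatial extension of the fading layers.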