US 12,254,696 B2
Item identification and tracking system
Shaked Dolev, Tel Aviv (IL); Yuval Snappir, Tel Aviv (IL); and Daniel Gabay, Even Yehuda (IL)
Assigned to TRIGO VISION LTD., Tel Aviv (IL)
Appl. No. 17/769,908
Filed by TRIGO VISION LTD., Tel Aviv (IL)
PCT Filed Nov. 19, 2020, PCT No. PCT/IL2020/051198
§ 371(c)(1), (2) Date Apr. 18, 2022,
PCT Pub. No. WO2021/100043, PCT Pub. Date May 27, 2021.
Claims priority of provisional application 62/938,563, filed on Nov. 21, 2019.
Claims priority of provisional application 62/938,681, filed on Nov. 21, 2019.
Prior Publication US 2022/0366578 A1, Nov. 17, 2022
Int. Cl. G06V 20/52 (2022.01); G06T 7/292 (2017.01); G06T 7/73 (2017.01); G06V 10/22 (2022.01)
CPC G06V 20/52 (2022.01) [G06T 7/292 (2017.01); G06T 7/73 (2017.01); G06V 10/22 (2022.01); G06T 2207/20081 (2013.01)] 14 Claims
OG exemplary drawing
 
1. A method for acquiring data relating to an object comprising:
arranging a multiplicity of cameras to view a scene, at least one reference object within said scene being viewable by at least a plurality of said multiplicity of cameras, each of said plurality of cameras acquiring at least one image of said reference object viewable thereby;
finding a point of intersection of light rays illuminating each of said plurality of cameras;
correlating a pixel location at which said reference object appears within each said at least one image to said light rays illuminating each of said plurality of cameras and intersecting with said point of intersection, irrespective of a three-dimensional location of said reference object within said scene, said correlating comprising deriving ray parameters of said light rays illuminating each of said plurality of cameras and corresponding to said pixel location at which said reference object appears within each said at least one image;
ascertaining whether said light rays having said derived ray parameters intersect with said point of intersection to an accuracy greater than or equal to a predetermined accuracy;
following said ascertaining that said light rays having said derived ray parameters intersect with said point of intersection to an accuracy greater than or equal to said predetermined accuracy, relocating said reference object within said scene, so as to be viewable by a different plurality of cameras of said multiplicity of cameras than said plurality of cameras by which said reference object was previously viewable prior to said relocation thereof;
finding a point of intersection of light rays illuminating each of said different plurality of cameras; and
correlating a pixel location at which said reference object appears within each said at least one image to said light rays illuminating each of said different plurality of cameras and intersecting with said point of intersection, irrespective of a three-dimensional location of said reference object within said scene.
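Claim 1 recites a calibration-style procedure: each camera's pixel observation of the reference object is correlated to a light ray, the point of intersection of those rays is found, and the intersection is checked against a predetermined accuracy before the reference object is relocated so a different plurality of cameras can be processed. The following is a minimal illustrative sketch of those geometric steps, assuming a simple pinhole camera model with known intrinsics K and pose (R, t) per camera; the helper names and the synthetic three-camera setup are assumptions for illustration, not the patented implementation.

# Illustrative sketch only, not part of the patent text: a minimal pinhole-camera
# model of the recited steps. Helper names (pixel_to_ray, intersect_rays,
# rays_intersect_within) and the synthetic setup below are assumptions.
import numpy as np

def pixel_to_ray(K, R, t, pixel):
    """Derive ray parameters (origin, unit direction) in world coordinates for
    the light ray corresponding to the pixel at which the reference object appears."""
    origin = -R.T @ t                              # camera center in the world frame
    uv1 = np.array([pixel[0], pixel[1], 1.0])      # homogeneous pixel coordinates
    direction = R.T @ np.linalg.inv(K) @ uv1       # back-project, rotate to world frame
    return origin, direction / np.linalg.norm(direction)

def intersect_rays(origins, directions):
    """Find the point of intersection of the rays in the least-squares sense:
    the point minimizing the sum of squared distances to all rays."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        P = np.eye(3) - np.outer(d, d)             # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

def rays_intersect_within(origins, directions, point, tolerance):
    """Ascertain whether every ray passes within `tolerance` of `point`,
    i.e. whether the rays intersect with at least the predetermined accuracy."""
    for o, d in zip(origins, directions):
        v = point - o
        residual = v - np.dot(v, d) * d            # perpendicular offset from the ray
        if np.linalg.norm(residual) > tolerance:
            return False
    return True

if __name__ == "__main__":
    # Synthetic scene: one reference object viewed by three cameras sharing intrinsics K.
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    reference_point = np.array([0.5, 0.2, 4.0])
    origins, directions = [], []
    for angle in (-0.2, 0.0, 0.2):
        R = np.array([[np.cos(angle), 0.0, np.sin(angle)],
                      [0.0, 1.0, 0.0],
                      [-np.sin(angle), 0.0, np.cos(angle)]])
        t = np.array([2.0 * angle, 0.0, 0.0])
        p_cam = R @ reference_point + t            # reference object in camera coordinates
        pixel = (K @ (p_cam / p_cam[2]))[:2]       # pixel location at which it appears
        o, d = pixel_to_ray(K, R, t, pixel)        # correlate pixel to a world-space ray
        origins.append(o)
        directions.append(d)
    point = intersect_rays(origins, directions)
    ok = rays_intersect_within(origins, directions, point, tolerance=1e-6)
    print("estimated point of intersection:", point, "meets accuracy:", ok)

In practice, rays back-projected from real cameras rarely meet at an exact point, which is why the claim tests intersection against a predetermined accuracy rather than exact coincidence; the least-squares midpoint above is one common way to define the point of intersection under such a tolerance.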