US 11,937,771 B2
Articulated structured light based-laparoscope
Tal Nir, Haifa (IL); Motti Frimer, Zichron Yaakov (IL); and Gal Atarot, Kfar Saba (IL)
Assigned to Asensus Surgical Europe S.à.R.L., Lugano (CH)
Filed by Asensus Surgical US, Inc., Durham, NC (US)
Filed on Sep. 14, 2021, as Appl. No. 17/475,198.
Application 17/475,198 is a continuation of application No. 15/129,925, granted, now 11,116,383, issued on Sep. 14, 2021, previously published as PCT/IL2015/050349, filed on Mar. 31, 2015.
Claims priority of provisional application 62/130,641, filed on Mar. 10, 2015.
Claims priority of provisional application 61/973,899, filed on Apr. 2, 2014.
Prior Publication US 2022/0394161 A1, Dec. 8, 2022
Int. Cl. A61B 1/00 (2006.01); A61B 1/005 (2006.01); A61B 1/008 (2006.01); A61B 1/04 (2006.01); A61B 1/05 (2006.01); A61B 1/06 (2006.01); A61B 1/313 (2006.01); A61B 18/22 (2006.01); A61B 34/00 (2016.01); A61B 34/20 (2016.01); A61B 34/30 (2016.01); A61B 90/00 (2016.01); G06T 7/33 (2017.01); G06T 15/20 (2011.01); H04N 23/50 (2023.01); H04N 23/56 (2023.01)
CPC A61B 1/00042 (2022.02) [A61B 1/00006 (2013.01); A61B 1/000096 (2022.02); A61B 1/00045 (2013.01); A61B 1/00194 (2022.02); A61B 1/008 (2013.01); A61B 1/009 (2022.02); A61B 1/042 (2013.01); A61B 1/05 (2013.01); A61B 1/0605 (2022.02); A61B 1/0676 (2013.01); A61B 1/313 (2013.01); A61B 1/3132 (2013.01); A61B 18/22 (2013.01); A61B 34/20 (2016.02); G06T 7/337 (2017.01); G06T 15/205 (2013.01); H04N 23/56 (2023.01); A61B 1/00 (2013.01); A61B 2034/2048 (2016.02); A61B 2034/2065 (2016.02); A61B 2034/301 (2016.02); A61B 2034/742 (2016.02); A61B 2090/367 (2016.02); A61B 2090/373 (2016.02); G06T 2207/10021 (2013.01); G06T 2207/10081 (2013.01); G06T 2207/10104 (2013.01); G06T 2207/10136 (2013.01); H04N 23/555 (2023.01)] 12 Claims
OG exemplary drawing
 
1. A method of using a structured-light based endoscope, comprising:
capturing at least one real-time 2D image of at least a portion of a field of view using a camera of an endoscope;
illuminating in real time at least a portion of at least one object within at least a portion of said field of view with a structured light pattern;
detecting light reflected from said field of view;
from said light reflected from said field of view, generating a 3D image of said field of view and calculating 3D locations of points on a surface of said at least one object, wherein said 3D image is constructable from said detected light reflected from said field of view and said structured light pattern;
locating, in real time, the 3D spatial position at any given time t of at least one of the endoscope and a surgical tool; and
generating a user alert if a distance between said surface and said at least one of the endoscope and the surgical tool, as determined using the 3D spatial position, falls below a predetermined distance;
wherein the method further comprises constructing said 3D image by calculating world coordinates of at least one point on said at least one object using the following equation:

OG Complex Work Unit Math
where $n^T$ is the transpose of the normal to the plane defined by the stripe ID $x_p$; $\tilde{x}_p = x_p + [\delta x_p,\, 0,\, f_p]^T$ is the perturbed stripe ID $x_p$; $R_p$ is the rotation matrix defining the transformation between the world coordinate system and the projector coordinate system; and $v_c$ is the direction of the ray between the stripe ID and the object point.
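
The equation itself is reproduced in the Official Gazette only as an image ("OG Complex Work Unit Math" above). From the variable definitions recited in the claim, the calculation has the form of a camera-ray/stripe-plane triangulation; the LaTeX sketch below shows one standard formulation consistent with those definitions, offered as an assumption (with the camera centre taken at the world origin and $P_w$ introduced here to denote the world-coordinate point), not as the verbatim claimed equation.

% Sketch only: intersection of the ray with direction v_c and the stripe
% plane with normal n passing through R_p * \tilde{x}_p; assumes the camera
% centre sits at the world origin. Not the verbatim patented equation.
\[
  P_w \;=\; \frac{n^{T} R_p\, \tilde{x}_p}{n^{T} v_c}\; v_c,
  \qquad
  \tilde{x}_p \;=\; x_p + \bigl[\delta x_p,\; 0,\; f_p\bigr]^{T}.
\]

The scalar ratio is the ray parameter that places the point on the stripe plane after the perturbed stripe ID has been rotated into world coordinates by $R_p$.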
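A minimal Python sketch of that intersection follows, under the same assumptions and treating the stripe ID $x_p$ as the horizontal coordinate on the projector image plane, so that the perturbed stripe ID becomes the 3-vector $[x_p+\delta x_p,\, 0,\, f_p]^T$; the function and argument names are illustrative, not identifiers from the patent.

import numpy as np

def triangulate_point(n, R_p, x_p, delta_x_p, f_p, v_c):
    """Ray/stripe-plane intersection (sketch, not the patented implementation).
    n          : (3,) normal of the plane defined by stripe ID x_p
    R_p        : (3, 3) rotation from the projector to the world coordinate system
    x_p        : scalar stripe ID (horizontal projector-plane coordinate, assumed)
    delta_x_p  : perturbation of the stripe ID
    f_p        : projector focal length
    v_c        : (3,) direction of the ray toward the object point
    Returns the (3,) world coordinates of the surface point."""
    x_p_tilde = np.array([x_p + delta_x_p, 0.0, f_p])  # perturbed stripe ID
    scale = (n @ (R_p @ x_p_tilde)) / (n @ v_c)        # ray parameter
    return scale * v_c                                  # 3D point on the surface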
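For the alert limitation, the following is a minimal sketch of the claimed proximity check, i.e. generating a user alert when the distance from the tracked endoscope or tool position to the reconstructed surface falls below a predetermined distance; all names and the threshold value are illustrative assumptions, not taken from the patent.

import numpy as np

MIN_CLEARANCE_MM = 5.0  # predetermined distance (assumed value, in mm)

def min_distance_to_surface(tool_xyz, surface_points):
    """Smallest Euclidean distance from the 3D tool/endoscope position to the
    3D surface points reconstructed from the structured light pattern."""
    diffs = np.asarray(surface_points, float) - np.asarray(tool_xyz, float)
    return float(np.min(np.linalg.norm(diffs, axis=1)))

def proximity_alert(tool_xyz, surface_points, threshold=MIN_CLEARANCE_MM):
    """Return True when a user alert should be generated, i.e. when the
    clearance falls below the predetermined distance."""
    return min_distance_to_surface(tool_xyz, surface_points) < threshold

# Usage: surface_points is an (N, 3) array of reconstructed surface points and
# tool_xyz is the real-time 3D spatial position of the tool tip at time t.
if __name__ == "__main__":
    surface = np.random.rand(1000, 3) * 100.0  # dummy reconstructed surface (mm)
    tool = np.array([50.0, 50.0, 52.0])        # dummy tool-tip position (mm)
    if proximity_alert(tool, surface):
        print("User alert: tool within the predetermined distance of the tissue surface")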