US 12,268,357 B2
Target part identification among human-body internal images
Masahiro Saikou, Tokyo (JP)
Assigned to NEC CORPORATION, Tokyo (JP)
Filed by NEC Corporation, Tokyo (JP)
Filed on Mar. 12, 2024, as Appl. No. 18/602,272.
Application 18/602,272 is a continuation of application No. 17/258,296, granted, now 11,969,143, previously published as PCT/JP2018/025871, filed on Jul. 9, 2018.
Prior Publication US 2024/0206701 A1, Jun. 27, 2024
This patent is subject to a terminal disclaimer.
Int. Cl. A61B 1/00 (2006.01); A61B 5/00 (2006.01); A61B 34/10 (2016.01); G06T 7/73 (2017.01)
CPC A61B 1/000094 (2022.02) [A61B 1/000096 (2022.02); A61B 5/4887 (2013.01); A61B 5/7221 (2013.01); A61B 34/10 (2016.02); G06T 7/73 (2017.01); A61B 2034/107 (2016.02); G06T 2207/10048 (2013.01); G06T 2207/10068 (2013.01); G06T 2207/30096 (2013.01)] 13 Claims
OG exemplary drawing
 
8. A surgery assistance method comprising:
calculating, from a human-body internal image captured using an endoscope, feature extraction information and confidence information of a target part image corresponding to a target part, the feature extraction information indicating features of the target part image extracted from the human-body internal image, the confidence information indicating a probability that an image in a window region of the human-body internal image is an image corresponding to the target part;
calculating a similarity degree of the confidence information between different human-body internal images, including the human-body internal image;
calculating, based on determining that the similarity degree of the confidence information is greater than or equal to a predetermined confidence value, a similarity degree of the feature extraction information between the different human-body internal images; and
identifying the target part image in each of the different human-body internal images if the similarity degree of the feature extraction information is greater than or equal to a predetermined feature extraction value.
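The claimed method can be read as a two-stage gated comparison: a cheap confidence-similarity check gates a feature-similarity check, and the target part is identified only when both pass. The following is a minimal sketch of that control flow, not the patented implementation; the dict layout, cosine similarity as the "similarity degree", and both threshold constants are illustrative assumptions not specified in the claim.

```python
import numpy as np

# Hypothetical thresholds standing in for the claim's "predetermined
# confidence value" and "predetermined feature extraction value".
CONFIDENCE_THRESHOLD = 0.8
FEATURE_THRESHOLD = 0.8

def cosine_similarity(a, b):
    """One possible 'similarity degree' between two flattened arrays."""
    a = np.ravel(a).astype(float)
    b = np.ravel(b).astype(float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify_target_part(frame_a, frame_b):
    """frame_* are dicts holding 'confidence' and 'features' arrays
    calculated from each human-body internal image (first step of the claim)."""
    # Second step: similarity degree of the confidence information
    # between the two different human-body internal images.
    conf_sim = cosine_similarity(frame_a["confidence"], frame_b["confidence"])
    if conf_sim < CONFIDENCE_THRESHOLD:
        # Gate failed: the feature similarity is never computed.
        return False
    # Third step: feature-similarity check, reached only when the
    # confidence similarity meets the predetermined confidence value.
    feat_sim = cosine_similarity(frame_a["features"], frame_b["features"])
    # Final step: identify the target part image in both images
    # when the feature similarity also meets its threshold.
    return feat_sim >= FEATURE_THRESHOLD
```

One design point the claim makes explicit: the feature-extraction comparison is conditional on the confidence comparison, so frames whose confidence maps disagree are rejected without the (presumably costlier) feature comparison.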