US 11,849,914 B2
Endoscopic image processing method and system, and computer device
Xinghui Fu, Shenzhen (CN); Zhongqian Sun, Shenzhen (CN); and Wei Yang, Shenzhen (CN)
Assigned to TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED, Shenzhen (CN)
Filed by TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED, Guangdong (CN)
Filed on Oct. 23, 2020, as Appl. No. 17/078,826.
Application 17/078,826 is a continuation of application No. PCT/CN2019/112202, filed on Oct. 21, 2019.
Claims priority of application No. 201811276885.2 (CN), filed on Oct. 30, 2018.
Prior Publication US 2021/0052135 A1, Feb. 25, 2021
Int. Cl. A61B 1/00 (2006.01); G06N 3/045 (2023.01)
CPC A61B 1/0005 (2013.01) [A61B 1/000094 (2022.02); A61B 1/000096 (2022.02); G06N 3/045 (2023.01)] 18 Claims
OG exemplary drawing
 
1. An endoscopic image processing method, comprising:
receiving at least one first endoscopic image specific to a human body part;
creating a deep convolutional network for endoscopic image prediction;
determining a training parameter of the deep convolutional network according to both (i) the at least one first endoscopic image and (ii) at least one second endoscopic image transformed from the at least one first endoscopic image, wherein the determining the training parameter includes iteratively adjusting the training parameter according to a loss function of the deep convolutional network, a value of the loss function being determined based on a feature of the at least one first endoscopic image and a feature of the at least one second endoscopic image;
receiving a current endoscopic image of a to-be-examined user;
predicting the current endoscopic image by using the deep convolutional network based on the training parameter that is determined according to the at least one first endoscopic image and the at least one second endoscopic image transformed from the at least one first endoscopic image, wherein the at least one first endoscopic image corresponds to the human body part; and
determining an organ category corresponding to the current endoscopic image.
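
The claim recites training a deep convolutional network with a loss whose value depends on a feature of a first endoscopic image and a feature of a second image transformed from it, then using the trained network to assign an organ category to a current endoscopic image. The sketch below illustrates that idea only; it is not the patented implementation. PyTorch is assumed, and the network name EndoNet, the horizontal-flip transformation, the MSE feature-consistency term, and the loss weighting are illustrative choices not specified in the claim.

```python
# Illustrative sketch of the claimed training/prediction flow (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F


class EndoNet(nn.Module):
    """Hypothetical deep convolutional network; the claim does not fix an architecture."""

    def __init__(self, num_organ_categories: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_organ_categories)

    def forward(self, x):
        feat = self.features(x).flatten(1)       # image feature
        return feat, self.classifier(feat)       # feature and organ-category logits


def training_step(model, optimizer, first_images, labels, consistency_weight=1.0):
    """One iterative adjustment of the training parameters.

    The loss value is computed from features of the first images and of the
    second images transformed from them (here: a horizontal flip, an assumption).
    """
    second_images = torch.flip(first_images, dims=[3])   # transformed copies
    feat_first, logits = model(first_images)
    feat_second, _ = model(second_images)
    loss = F.cross_entropy(logits, labels) \
        + consistency_weight * F.mse_loss(feat_first, feat_second)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


@torch.no_grad()
def predict_organ_category(model, current_image):
    """Predict the organ category corresponding to a current endoscopic image."""
    model.eval()
    _, logits = model(current_image.unsqueeze(0))        # add batch dimension
    return logits.argmax(dim=1).item()
```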