CPC A61B 5/0071 (2013.01) [A61B 5/0261 (2013.01); A61B 5/20 (2013.01); A61B 5/489 (2013.01); A61B 5/4893 (2013.01); A61B 5/7203 (2013.01); A61B 5/725 (2013.01); H04N 13/257 (2018.05); H04N 23/72 (2023.01); H04N 25/131 (2023.01); A61B 2560/0223 (2013.01); A61B 2576/00 (2013.01)] — 20 Claims
1. A system for providing visualization in a light deficient environment, the system comprising:
an emitter that emits a plurality of pulses of electromagnetic radiation according to a pulse cycle, wherein the emitter comprises a plurality of sources of electromagnetic radiation that are independently actuatable, and wherein the plurality of sources comprises:
a visible source that pulses electromagnetic radiation within a visible waveband of the electromagnetic spectrum;
a multispectral source that pulses a multispectral emission of electromagnetic radiation that elicits a spectral response from a tissue; and
a fluorescence source that pulses a fluorescence excitation emission of electromagnetic radiation;
an image sensor comprising a pixel array that senses reflected electromagnetic radiation resulting from the plurality of pulses of electromagnetic radiation to generate a plurality of exposure frames; and
a controller in communication with the emitter and the image sensor that synchronizes operations of the emitter and the image sensor, wherein the controller comprises one or more processors configured to execute instructions comprising:
instructing the emitter to cycle two or more sources of the plurality of sources to generate an output video stream rendered based on color visualization data and one or more of multispectral visualization data or fluorescence visualization data; and
determining the pulse cycle for cycling the two or more sources based on a desired frame rate of the image sensor;
wherein the output video stream further comprises dimensional information that is calculated based on data output by the image sensor, wherein the dimensional information comprises an indication of a distance between a tool and an object within the light deficient environment; and
wherein an edge enhancement algorithm is applied to edges within an exposure frame of the plurality of exposure frames to render the output video stream with increased edge differentiation.
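The claim recites determining the pulse cycle from a desired sensor frame rate when two or more sources are cycled. One way to picture this relationship (a hypothetical sketch only; the names `Pulse` and `pulse_cycle` and the one-pulse-per-frame interleaving are assumptions, not details recited in the claim) is that each exposure frame captures one source's pulse, so a full cycle repeats every `len(sources)` frames:

```python
# Hypothetical sketch: derive a repeating pulse cycle from a desired
# image-sensor frame rate. All names here are illustrative; the claim
# does not specify this scheduling scheme.
from dataclasses import dataclass

@dataclass(frozen=True)
class Pulse:
    source: str        # e.g. "visible", "multispectral", "fluorescence"
    duration_s: float  # pulse duration, bounded by the frame period

def pulse_cycle(sources, frame_rate_hz, duty=0.8):
    """Assign one pulse per exposure frame. The cycle then repeats
    every len(sources) frames, so each visualization data type
    refreshes at frame_rate_hz / len(sources)."""
    frame_period_s = 1.0 / frame_rate_hz
    return [Pulse(s, duty * frame_period_s) for s in sources]

cycle = pulse_cycle(["visible", "multispectral", "fluorescence"], 120.0)
```

Under these assumptions, a 120 Hz sensor cycling three sources yields three interleaved exposure-frame streams, each updating at 40 Hz; the controller's synchronization of emitter and sensor is what keeps each exposure frame paired with the pulse that produced it.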
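The final clause applies an edge enhancement algorithm to edges within an exposure frame but does not name a particular algorithm. One common technique that fits the recited effect of increased edge differentiation is unsharp masking (offered here purely as an assumption, with an illustrative function name):

```python
# Hypothetical unsharp-mask edge enhancement of one exposure frame,
# represented as a 2D list of floats. The claim names no specific
# algorithm; this is one conventional choice.
def enhance_edges(frame, amount=1.0):
    """Sharpen interior pixels by adding back the difference between
    each pixel and its 3x3 box-blurred neighborhood; border pixels
    are left unchanged."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            blur = sum(frame[y + dy][x + dx]
                       for dy in (-1, 0, 1)
                       for dx in (-1, 0, 1)) / 9.0
            out[y][x] = frame[y][x] + amount * (frame[y][x] - blur)
    return out
```

In flat regions the pixel equals its neighborhood mean, so nothing changes; across an edge the difference term pushes values apart, which is the increased edge differentiation the claim describes in the rendered output video stream.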