US 11,928,265 B1
Finger tracking to write on or correct mistakes on physical documents
Jennifer Sanctis, Charlotte, NC (US); Mary Bangs, New York, NY (US); Veronica Andrea Cadavid, New York, NY (US); Taylor Farris, Hoboken, NJ (US); Trish Gillis, Chicago, IL (US); Jesse James Godley, Charlotte, NC (US); Brian Meyers, New York, NY (US); and Vishwas Korde, Matthews, NC (US)
Assigned to Bank of America Corporation, Charlotte, NC (US)
Filed by Bank of America Corporation, Charlotte, NC (US)
Filed on May 23, 2023, as Appl. No. 18/200,903.
Int. Cl. G06F 3/01 (2006.01); G02B 27/01 (2006.01); G06T 7/70 (2017.01); G06T 19/00 (2011.01); H04B 10/116 (2013.01)
CPC G06F 3/017 (2013.01) [G02B 27/017 (2013.01); G06T 7/70 (2017.01); G06T 19/006 (2013.01); H04B 10/116 (2013.01); G02B 2027/0178 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A smart lens configured for capturing data from a first user interface (“UI”) and transmitting the data to a second UI, the transmitting leveraging light fidelity (“Lifi”), the smart lens located on a user, the smart lens comprising:
a micro camera operable to capture data from the first UI and from the second UI;
a memory for storing the data captured from the first UI and from the second UI;
a sensor configured to capture movements of one or more of the user's fingers;
a loop antenna configured to enable radio frequency communication;
a light emitting diode (“LED”) attached to a substrate on the smart lens, the LED connected to a microcontroller, the microcontroller operable to move the LED, and the LED operable to transmit the data captured at the first UI to a second terminal supporting the second UI;
a microprocessor operable to capture, store, and transmit data to a Lifi receiver at a first terminal and a Lifi receiver at the second terminal, the first terminal supporting the first UI;
wherein, when the sensor detects a point of movement of the user's finger on the first UI, the microprocessor is operable to:
execute a finger movement tracking application configured to:
determine the point of movement of the user's one or more fingers on the first UI, the point of movement directed to a data entry field associated with a data entry field identifier within the first UI;
detect a deliberate finger gesture while the user gazes at the point of movement; and
in response to the detection, identify a data segment at the point of movement within the data entry field;
execute a data capturing application configured to:
capture the data segment and the associated data entry field identifier using the micro camera; and
store the data segment and the associated data entry field identifier in the memory of the smart lens; and
in response to a detection of the point of movement of the user's one or more fingers on the second UI:
the finger movement tracking application is further configured to detect a deliberate finger gesture while the user gazes at the point of movement;
the LED is configured to transmit a data packet compiled at the smart lens to the second terminal, the data packet including the data segment, the associated data entry field identifier from the smart lens, and an instruction to update the second UI to incorporate the data segment at the point of movement of the user's one or more fingers on the second UI;
the Lifi receiver at the second terminal is configured to receive the data packet; and
a processor at the second terminal is configured to update the second UI by inputting the data segment at the point of movement on the second UI.
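
For readers tracing the claimed flow, the following Python sketch (not part of the patent; all class, function, and field names are hypothetical) illustrates one way the capture-then-transfer sequence recited in claim 1 could be modeled: a deliberate finger gesture at the first UI captures a data segment and its data entry field identifier into lens memory, and a later gesture at the second UI compiles a data packet (segment, field identifier, update instruction) for transmission over the LED-based Lifi link, which the second terminal applies at the indicated point of movement.

# Minimal sketch, assuming hypothetical names; the patent does not specify code.
from dataclasses import dataclass


@dataclass
class DataPacket:
    """Packet compiled at the smart lens, per claim 1."""
    data_segment: str       # text captured by the micro camera at the first UI
    field_identifier: str   # data entry field identifier associated with the segment
    instruction: str        # directive for the second terminal to update its UI
    target_point: tuple     # (x, y) point of finger movement on the second UI


class SmartLens:
    def __init__(self):
        self.memory = {}    # captured data segments keyed by field identifier

    def on_gesture_first_ui(self, point, field_id, camera_text):
        # Finger movement tracking application: a deliberate gesture while the
        # user gazes at the point of movement identifies the data segment there.
        # Data capturing application: store segment + identifier in lens memory.
        self.memory[field_id] = camera_text

    def on_gesture_second_ui(self, point, field_id, lifi_transmit):
        # On a deliberate gesture at the second UI, compile the data packet and
        # hand it to the LED-based transmitter; lifi_transmit is a stand-in for
        # the modulated-LED Lifi link the claim describes.
        packet = DataPacket(
            data_segment=self.memory[field_id],
            field_identifier=field_id,
            instruction="insert_at_point",
            target_point=point,
        )
        lifi_transmit(packet)


def second_terminal_receiver(packet: DataPacket, ui_fields: dict):
    # Processor at the second terminal: apply the instruction by writing the
    # data segment into the field at the indicated point of movement.
    if packet.instruction == "insert_at_point":
        ui_fields[packet.field_identifier] = packet.data_segment
    return ui_fields


# Example: capture a value from the "name" field on the first UI, then insert
# it into the matching field on the second UI via a simulated Lifi link.
lens = SmartLens()
lens.on_gesture_first_ui(point=(120, 88), field_id="name", camera_text="Jane Q. Public")
ui2_fields = {"name": ""}
lens.on_gesture_second_ui(
    point=(40, 200),
    field_id="name",
    lifi_transmit=lambda pkt: second_terminal_receiver(pkt, ui2_fields),
)
print(ui2_fields)  # {'name': 'Jane Q. Public'}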