US 12,450,153 B2
Automated testing of mobile devices using visual analysis
Dong Chen, Lansing, MI (US); Tor Fredericks, Bellevue, WA (US); Anqi Luo, Redmond, WA (US); and Pei Zheng, Sammamish, WA (US)
Assigned to T-Mobile USA, Inc., Bellevue, WA (US)
Filed by T-Mobile USA, Inc., Bellevue, WA (US)
Filed on May 28, 2024, as Appl. No. 18/676,131.
Application 18/676,131 is a continuation of application No. 17/094,741, filed on Nov. 10, 2020, granted, now Pat. No. 12,026,084.
Prior Publication US 2024/0311277 A1, Sep. 19, 2024
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 11/3698 (2025.01); G06F 11/3668 (2025.01); G06N 20/00 (2019.01); G06F 3/04817 (2022.01)
CPC G06F 11/3698 (2025.01) [G06F 11/3688 (2013.01); G06N 20/00 (2019.01); G06F 3/04817 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method comprising:
training a machine learning model based on a set of interaction data associated with a user goal, the interaction data including at least locations of touch selections and screenshot images, wherein the training includes removing outlier interactions from the set of interaction data, the outlier interactions including extra user selections or user selections for which coordinates are greater than a threshold distance from average coordinates for other user selections;
receiving a request to complete a testing routine on a mobile device;
until determining that the testing routine is complete, repeating a process that comprises:
obtaining a current snapshot of a screen of the mobile device;
determining a testing action for the mobile device based on the current snapshot, wherein determining the testing action for the mobile device based on the current snapshot comprises providing the current snapshot to the machine learning model; and
executing the testing action for the mobile device; and
responding to the request with whether the testing routine is a failure or success based upon the current snapshot.
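The outlier-removal step recited in the claim (dropping user selections whose coordinates are farther than a threshold distance from the average coordinates of the other selections) can be sketched as a leave-one-out filter. This is an illustrative reading of the claim language, not the patentee's implementation; the dictionary format, the `threshold` value, and the function name are assumptions. Note that a single extreme outlier also shifts the leave-one-out average for the inliers, so in practice the threshold must be set generously (or a robust statistic substituted for the plain average).

```python
import math

def remove_outlier_selections(interactions, threshold=300.0):
    """Keep only tap interactions whose (x, y) coordinates lie within
    `threshold` pixels of the average coordinates of the OTHER taps,
    per the leave-one-out distance test described in the claim."""
    kept = []
    for i, tap in enumerate(interactions):
        others = interactions[:i] + interactions[i + 1:]
        if not others:
            kept.append(tap)  # a lone tap has no peers to compare against
            continue
        avg_x = sum(t["x"] for t in others) / len(others)
        avg_y = sum(t["y"] for t in others) / len(others)
        # Euclidean distance from this tap to the peers' average position
        dist = math.hypot(tap["x"] - avg_x, tap["y"] - avg_y)
        if dist <= threshold:
            kept.append(tap)
    return kept
```

With four taps clustered near (100, 200) and one stray tap at (500, 900), the stray tap is filtered out while the cluster survives. The claim also mentions removing "extra user selections" (taps beyond those needed for the goal); that count-based filter is omitted here for brevity.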
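The claimed testing loop (snapshot the screen, feed the snapshot to the model to pick an action, execute it, and repeat until the routine is determined complete, then report success or failure from the final snapshot) can be sketched as below. The `StubDevice` and `StubModel` classes, their method names, and the `max_steps` safety bound are all assumptions introduced for illustration; the patent does not prescribe these interfaces.

```python
class StubDevice:
    """Hypothetical device adapter: screens advance as actions execute."""
    def __init__(self, screens):
        self.screens = screens
        self.index = 0
    def screenshot(self):
        return self.screens[self.index]
    def execute(self, action):
        # Executing any action advances to the next screen in this stub.
        self.index = min(self.index + 1, len(self.screens) - 1)

class StubModel:
    """Hypothetical trained model: classifies snapshots and picks actions."""
    def routine_complete(self, snapshot):
        return snapshot in ("success", "failure")
    def is_success(self, snapshot):
        return snapshot == "success"
    def predict_action(self, snapshot):
        return ("tap", 100, 200)  # placeholder coordinates

def run_testing_routine(device, model, max_steps=25):
    """Repeat snapshot -> predict action -> execute until the model
    determines the routine is complete; report pass/fail from the
    final snapshot. max_steps guards against a non-terminating loop."""
    snapshot = device.screenshot()
    for _ in range(max_steps):
        if model.routine_complete(snapshot):
            return model.is_success(snapshot)
        action = model.predict_action(snapshot)
        device.execute(action)
        snapshot = device.screenshot()
    return False  # did not complete within the step budget
```

For example, a device whose screens progress `"home" -> "menu" -> "success"` yields a passing result, because the loop keeps acting until the model recognizes the terminal screen.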