CPC G06Q 20/20 (2013.01) [G06N 20/00 (2019.01); G06Q 20/40 (2013.01); G06Q 20/407 (2013.01)]
11 Claims
1. A method comprising:
receiving, by at least one processor, an electronic activity verification from an entity;
wherein the electronic activity verification is associated with an electronic activity of a user account verified by the entity;
wherein the electronic activity verification comprises a verified value indicative of a value associated with the electronic activity;
accessing, by the at least one processor, an electronic activity history, the electronic activity history comprising:
a plurality of electronic activity requests indicative of a plurality of initializations of electronic activities pending approval by the user, and
a plurality of electronic activity verifications indicative of a plurality of user verifications of the electronic activities upon approval by the user of the plurality of electronic activity requests;
filtering, by the at least one processor, the plurality of electronic activity requests in the electronic activity history by a predetermined time period associated with the electronic activity verification to identify at least one candidate electronic activity request;
utilizing, by the at least one processor, at least one similarity measure to determine a similarity between the at least one candidate electronic activity request and the electronic activity verification;
determining, by the at least one processor, a particular electronic activity request of the at least one candidate electronic activity request based at least in part on the at least one similarity measure and a similarity threshold value;
determining, by the at least one processor, a user-specified value indicative of an additional value specified by the user for the electronic activity based at least in part on a difference between the particular electronic activity request and the electronic activity verification;
encoding, by the at least one processor, the verified value and the user-specified value into a feature vector;
inputting, by the at least one processor, the feature vector into an anomalous attribute classification machine learning model to output an anomaly classification, wherein the anomalous attribute classification machine learning model comprises a plurality of trained machine learning model parameters trained to predict a likelihood of user dispute of the verified electronic activity based on training data comprising:
a history of verified electronic activities and
a history of disputed electronic activities;
wherein the anomaly classification comprises one of:
i) an anomalous user-specified value classification, or
ii) a non-anomalous user-specified value classification;
generating, by the at least one processor, a dispute graphical user interface (GUI) comprising an alert message and a dispute interface element;
wherein the alert message represents the anomalous user-specified value classification, indicating an incorrect user-specified value of the electronic activity verification;
wherein the dispute interface element comprises a user selectable element configured to enable the user to perform a user selection to dispute the electronic activity or accept the electronic activity, wherein the user selection to dispute the electronic activity causes an electronic request to dispute the electronic activity verification so as to prevent execution of the electronic activity;
causing, by the at least one processor, the dispute GUI to be displayed on a user computing device associated with the user;
receiving, by the at least one processor, a user selection of the user selectable element;
wherein the user selection causes the at least one processor to:
determine a classification error of the anomalous attribute classification machine learning model based on the anomaly classification and the user selection; and
retrain the plurality of trained machine learning model parameters of the anomalous attribute classification machine learning model based at least in part on the classification error to improve accuracy of the anomaly classification for the user; and
inputting, by the at least one processor, into the anomalous attribute classification machine learning model, a subsequent feature vector of a subsequent electronic activity verification to output a subsequent anomaly classification based at least in part on the plurality of trained machine learning model parameters having been retrained based on the user selection.
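The time-period filtering, similarity scoring, and threshold-based matching recited above can be pictured with the following minimal Python sketch. The ActivityRequest and ActivityVerification types, the 24-hour window, the blended token/value similarity measure, and the 0.8 threshold are illustrative assumptions, not limitations drawn from the claim.

from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ActivityRequest:
    request_id: str
    merchant: str
    amount: float          # value at initialization, pending user approval
    timestamp: datetime

@dataclass
class ActivityVerification:
    merchant: str
    verified_value: float  # value reported by the verifying entity
    timestamp: datetime

def filter_by_time_window(requests, verification, window=timedelta(hours=24)):
    # Keep only requests within a predetermined period of the verification.
    return [r for r in requests if abs(verification.timestamp - r.timestamp) <= window]

def similarity(request, verification):
    # Toy similarity measure blending merchant-token overlap with value closeness.
    req_tokens = set(request.merchant.lower().split())
    ver_tokens = set(verification.merchant.lower().split())
    token_sim = len(req_tokens & ver_tokens) / max(len(req_tokens | ver_tokens), 1)
    value_sim = 1.0 - min(abs(verification.verified_value - request.amount)
                          / max(verification.verified_value, 1e-9), 1.0)
    return 0.5 * token_sim + 0.5 * value_sim

def match_request(requests, verification, threshold=0.8):
    # Return the best-scoring candidate above the similarity threshold, if any.
    candidates = filter_by_time_window(requests, verification)
    scored = sorted(((similarity(r, verification), r) for r in candidates),
                    key=lambda pair: pair[0], reverse=True)
    if scored and scored[0][0] >= threshold:
        return scored[0][1]
    return None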
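Determining the user-specified value as the difference between the matched request and the verification, and encoding it together with the verified value into a feature vector, might look like the sketch below, which reuses the illustrative types from the previous sketch; the three-element vector (verified value, user-specified value, and their ratio) is an assumption made for illustration.

import numpy as np

def user_specified_value(matched_request, verification):
    # Difference between the verified value and the originally requested value,
    # e.g., a gratuity the user added on approval (illustrative interpretation).
    return verification.verified_value - matched_request.amount

def encode_feature_vector(matched_request, verification):
    # Encode the verified value, the user-specified value, and a simple ratio
    # into a fixed-length feature vector for the classifier.
    extra = user_specified_value(matched_request, verification)
    ratio = extra / max(matched_request.amount, 1e-9)
    return np.array([verification.verified_value, extra, ratio], dtype=np.float32)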
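One way to picture the anomalous attribute classification machine learning model is the sketch below, in which a scikit-learn logistic regression stands in for the trained model: it is fit on feature vectors from a history of verified activities (treated as non-disputed) and a history of disputed activities, and it maps a predicted dispute likelihood to one of the two anomaly classifications. The specific estimator and the 0.5 likelihood threshold are assumptions, not features of the claimed model.

import numpy as np
from sklearn.linear_model import LogisticRegression

ANOMALOUS = "anomalous_user_specified_value"
NON_ANOMALOUS = "non_anomalous_user_specified_value"

def train_classifier(verified_history, disputed_history):
    # Fit on feature vectors of previously verified (label 0) and disputed (label 1) activities.
    X = np.vstack([verified_history, disputed_history])
    y = np.concatenate([np.zeros(len(verified_history)), np.ones(len(disputed_history))])
    model = LogisticRegression(max_iter=1000)
    model.fit(X, y)
    return model

def classify(model, feature_vector, dispute_likelihood_threshold=0.5):
    # Map the predicted dispute likelihood to one of the two anomaly classifications.
    likelihood = model.predict_proba(feature_vector.reshape(1, -1))[0, 1]
    return ANOMALOUS if likelihood >= dispute_likelihood_threshold else NON_ANOMALOUS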
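The feedback loop in which the user's dispute-or-accept selection is compared against the anomaly classification, a classification error is determined, and the model parameters are retrained can be sketched as follows, continuing the previous sketch; retraining only when the prediction was wrong is an illustrative simplification.

import numpy as np

def record_feedback_and_retrain(model, feature_vector, anomaly_classification,
                                user_disputed, X_train, y_train):
    # Compare the model's classification with the user's selection, append the
    # labeled example, and refit the model parameters when they disagree.
    predicted_dispute = (anomaly_classification == ANOMALOUS)
    classification_error = (predicted_dispute != user_disputed)
    X_train = np.vstack([X_train, feature_vector.reshape(1, -1)])
    y_train = np.append(y_train, 1.0 if user_disputed else 0.0)
    if classification_error:
        model.fit(X_train, y_train)
    return model, X_train, y_train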