| CPC G06F 40/103 (2020.01) [G06F 40/177 (2020.01); G06T 11/206 (2013.01); G06V 30/40 (2022.01)] | 18 Claims |

7. A computer-implemented method comprising:
obtaining, from a presentation computer application, a target editable object associated with a first data set;
calculating a similarity measure for a plurality of predefined editable objects by comparing the plurality of predefined editable objects to the target editable object, each predefined editable object of the plurality of predefined editable objects associated with a second data set, wherein the similarity measure is based on data of the first data set and the second data set, semantic information associated with the first data set and the second data set, or categories of the first data set and the second data set;
identifying one or more predefined editable objects from the plurality of predefined editable objects based on a similarity measurement of the one or more predefined editable objects being outside a threshold similarity measure;
extracting a style of the one or more predefined editable objects by a style parser of a neural network;
outputting the one or more predefined editable objects in a user interface of the presentation computer application; and
upon receipt of a selection of the one or more predefined editable objects, applying the style of the one or more predefined editable objects to the target editable object by a style adapter of the neural network in an application window of the presentation computer application, wherein the style of the one or more predefined editable objects comprises at least one visual display characteristic.
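The claimed method can be summarized as a pipeline: score candidate objects against the target, filter by a similarity threshold, extract each surviving candidate's style, and apply a selected style to the target. The sketch below illustrates that pipeline only; the class name `EditableObject`, the set-overlap similarity measure, and the dictionary-based `extract_style`/`apply_style` helpers are all illustrative assumptions standing in for the claim's neural style parser and style adapter, not an implementation of them.

```python
from dataclasses import dataclass, field

@dataclass
class EditableObject:
    # Hypothetical stand-in for an editable object (e.g., a chart or table)
    # in a presentation computer application.
    data: dict                                  # the object's data set
    category: str                               # data-set category
    style: dict = field(default_factory=dict)   # visual display characteristics

def similarity(target: EditableObject, candidate: EditableObject) -> float:
    # Toy similarity measure: overlap of data keys plus a category-match
    # bonus; the claim's semantic comparison is not modeled here.
    shared = len(set(target.data) & set(candidate.data))
    total = len(set(target.data) | set(candidate.data)) or 1
    score = shared / total
    if target.category == candidate.category:
        score += 0.5
    return score

def extract_style(obj: EditableObject) -> dict:
    # Placeholder for the claimed style parser of a neural network.
    return dict(obj.style)

def apply_style(target: EditableObject, style: dict) -> EditableObject:
    # Placeholder for the claimed style adapter: copies visual display
    # characteristics onto the target editable object.
    target.style.update(style)
    return target

def recommend(target, candidates, threshold=0.5):
    # Identify predefined editable objects whose similarity measurement
    # satisfies the threshold, preserving candidate order.
    return [c for c in candidates if similarity(target, c) >= threshold]
```

A filtered candidate would then be shown in the application's user interface, and on selection its extracted style applied to the target.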