| CPC G06F 40/35 (2020.01) [G06F 16/243 (2019.01); G06F 16/322 (2019.01); G06F 16/3329 (2019.01); G06F 16/951 (2019.01); G06F 40/123 (2020.01); G06F 40/126 (2020.01); G06F 40/20 (2020.01); G06F 40/205 (2020.01); G06F 40/211 (2020.01); G06F 40/226 (2020.01); G06F 40/242 (2020.01); G06F 40/279 (2020.01); G06F 40/30 (2020.01); G06F 40/45 (2020.01); G06F 40/47 (2020.01); G06F 40/58 (2020.01); G06N 3/0442 (2023.01); G06N 3/0455 (2023.01); G06N 3/0499 (2023.01); G06N 3/08 (2013.01); G06N 5/02 (2013.01); G06Q 10/1053 (2013.01); G06Q 30/0255 (2013.01); G06Q 30/0257 (2013.01); G06Q 30/0631 (2013.01); G10L 15/16 (2013.01); G10L 15/1815 (2013.01); G10L 15/22 (2013.01); G10L 15/26 (2013.01); G10L 25/63 (2013.01); G16H 10/60 (2018.01); H04L 51/02 (2013.01); G06N 3/091 (2023.01); G10L 2015/088 (2013.01)] | 28 Claims |

1. A computer-implemented method for the automated analysis or use of data, comprising the steps of:
(a) storing in a non-transitory computer-readable medium a structured, machine-readable representation of data that conforms to a machine-readable language, in which the data relates to user speech or text information input to a human/machine interface;
(b) automatically processing the structured machine-readable representation to analyse the user speech or text information;
(c) identifying a first wake word in the analysed user speech or text information, and initiating processing in response to identifying the first wake word, and entering a privacy-preserving state after the initiating processing; and
(d) after entering the privacy-preserving state, identifying a second wake word in the analysed user speech or text information, and continuing processing in response to identifying the second wake word, wherein the second wake word is sufficiently long or unusual that a false recognition of the second wake word is significantly more improbable than a false recognition of the first wake word;
in which a neural architecture is used to generate the machine-readable language and the neural architecture utilises recurrent neural networks or long short-term memories (LSTMs) or attention mechanisms or transformers.
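The two-wake-word flow of steps (c) and (d) can be sketched as follows. This is a minimal illustrative sketch, not the claimed implementation: the class name, wake-word phrases, state names, and return values are all hypothetical assumptions, and real systems would use an acoustic wake-word detector (e.g. an LSTM or transformer model, as the claim contemplates) rather than substring matching.

```python
# Hypothetical sketch of claim 1, steps (c)-(d): a short first wake word
# initiates processing and the session then enters a privacy-preserving
# state; while in that state, only a longer, more unusual second wake
# word (far less likely to be falsely recognised) continues processing.
class WakeWordSession:
    def __init__(self,
                 first_wake="hey",                       # short, easy to trigger
                 second_wake="aurora continue listening"):  # long, hard to false-trigger
        self.first_wake = first_wake
        self.second_wake = second_wake
        self.processing = False
        self.privacy_preserving = False

    def handle(self, utterance: str) -> str:
        text = utterance.lower()
        # Step (c): identify the first wake word, initiate processing,
        # then enter the privacy-preserving state.
        if not self.processing and self.first_wake in text:
            self.processing = True
            self.privacy_preserving = True
            return "processing-started"
        # Step (d): after entering the privacy-preserving state, only the
        # second wake word continues processing.
        if self.privacy_preserving and self.second_wake in text:
            self.privacy_preserving = False
            return "processing-continued"
        if self.privacy_preserving:
            return "ignored"  # other speech is not acted on in this state
        return "no-op"
```

Using substring containment here stands in for whatever recognition model scores the wake words; the point of the sketch is only the state transitions between the two wake words and the privacy-preserving state.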