US 12,190,061 B2
System and methods for neural topic modeling using topic attention networks
Shashank Shailabh, Bihar (IN); Madhur Panwar, Uttar Pradesh (IN); Milan Aggarwal, Delhi (IN); Pinkesh Badjatiya, Madhya Pradesh (IN); Simra Shahid, Uttar Pradesh (IN); Nikaash Puri, New Delhi (IN); S Sejal Naidu, Madhya Pradesh (IN); Sharat Chandra Racha, Telangana (IN); Balaji Krishnamurthy, Uttar Pradesh (IN); and Ganesh Karbhari Palwe, Uttar Pradesh (IN)
Assigned to ADOBE INC., San Jose, CA (US)
Filed by ADOBE INC., San Jose, CA (US)
Filed on Dec. 17, 2021, as Appl. No. 17/644,856.
Claims priority of provisional application 63/264,688, filed on Nov. 30, 2021.
Prior Publication US 2023/0169271 A1, Jun. 1, 2023
Int. Cl. G06F 40/289 (2020.01); G06F 40/30 (2020.01); G06F 40/40 (2020.01)
CPC G06F 40/289 (2020.01) [G06F 40/30 (2020.01); G06F 40/40 (2020.01)] 20 Claims
OG exemplary drawing
 
12. A method for topic modeling, comprising:
encoding words of a document using an embedding matrix to obtain word embeddings for the document, wherein the words of the document comprise a subset of words in a vocabulary;
generating a sequence of hidden representations corresponding to the word embeddings using a sequential encoder, wherein the sequence of hidden representations comprises an order based on an order of the words in the document;
computing a context vector for the document based on the sequence of hidden representations;
generating a latent vector based on the context vector using an auto-encoder;
computing a loss function based on the latent vector;
updating parameters of the embedding matrix and a topic attention network based on the loss function; and
predicting, using the embedding matrix and the topic attention network, a set of words including a topic for an input document based on a context vector for the input document.
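The claimed pipeline (embed words, encode them sequentially, pool a context vector, pass it through an auto-encoder, compute a loss, and score topic words) can be sketched as follows. This is a minimal illustrative sketch only, not the patented implementation: the dimensions, the mean-pooled context vector, the reconstruction loss, and the dot-product topic attention are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not specified in the claim)
vocab_size, embed_dim, hidden_dim, num_topics = 50, 16, 16, 4

# Embedding matrix: one row per word in the vocabulary
E = rng.normal(0, 0.1, (vocab_size, embed_dim))

# Simple recurrent (sequential) encoder: h_t = tanh(W x_t + U h_{t-1})
W = rng.normal(0, 0.1, (hidden_dim, embed_dim))
U = rng.normal(0, 0.1, (hidden_dim, hidden_dim))

def encode(word_ids):
    """Hidden representations in the order the words occur in the document."""
    h = np.zeros(hidden_dim)
    hs = []
    for t in word_ids:
        h = np.tanh(W @ E[t] + U @ h)
        hs.append(h)
    return np.stack(hs)

def context_vector(hs):
    """One plausible context vector: the mean of the hidden representations."""
    return hs.mean(axis=0)

# Auto-encoder over the context vector: context -> latent -> reconstruction
A_enc = rng.normal(0, 0.1, (num_topics, hidden_dim))
A_dec = rng.normal(0, 0.1, (hidden_dim, num_topics))

def latent(c):
    return np.tanh(A_enc @ c)

def loss(c):
    """Reconstruction loss on the context vector; the claim's loss,
    which drives updates to the embedding matrix and topic attention
    network, is not specified here."""
    z = latent(c)
    return float(((A_dec @ z - c) ** 2).mean())

# Topic attention (hypothetical form): score vocabulary words against
# learned topic embeddings to predict a set of words for a topic.
T = rng.normal(0, 0.1, (num_topics, embed_dim))

def predict_topic_words(c, k=5):
    z = latent(c)
    topic = int(np.argmax(z))      # most active topic for this context
    scores = E @ T[topic]          # word-topic attention scores
    return np.argsort(-scores)[:k]

doc = [3, 7, 7, 19, 4]             # toy input document as word ids
hs = encode(doc)
c = context_vector(hs)
print(loss(c), predict_topic_words(c))
```

In training, the loss would be back-propagated to update `E`, `W`, `U`, the auto-encoder weights, and `T`; the sketch omits the optimizer and shows only the forward pass described by the claim's steps.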