US 12,436,966 B1
Utilizing a large language model to generate a computing structure
Eric B. Hensley, San Francisco, CA (US); and Mark C. Wolochuk, Portland, OR (US)
Assigned to Aravo Solutions, Inc., San Francisco, CA (US)
Filed by Aravo Solutions, Inc., San Francisco, CA (US)
Filed on Sep. 23, 2024, as Appl. No. 18/893,706.
Int. Cl. G06F 16/00 (2019.01); G06F 16/25 (2019.01)
CPC G06F 16/258 (2019.01) 20 Claims
OG exemplary drawing
 
1. A method for generating a computing structure based on both unstructured data and structured data using a large language model (LLM), the method comprising:
receiving, using one or more computing device processors, first unstructured data from a first data source;
receiving, using the one or more computing device processors, first structured data from a second data source;
determining, using the one or more computing device processors, a first computing library associated with the first structured data;
receiving, using the one or more computing device processors, second unstructured data from a third data source;
determining, using the one or more computing device processors, a first set of computing prompts associated with the second unstructured data;
receiving, using the one or more computing device processors, second structured data, associated with a first computing format, from a fourth data source;
determining, using the one or more computing device processors, based on the first computing format, a set of computing structures associated with the second structured data;
transmitting, using the one or more computing device processors, at a first time, the first unstructured data to an LLM;
transmitting, using the one or more computing device processors, at a second time or the first time, the first computing library associated with the first structured data to the LLM;
transmitting, using the one or more computing device processors, at a third time, the second time, or the first time, the first set of computing prompts associated with the second unstructured data to the LLM;
transmitting, using the one or more computing device processors, at a fourth time, the third time, the second time, or the first time, the set of computing structures associated with the second structured data to the LLM;
receiving, using the one or more computing device processors, third structured data, associated with a second computing format, from the LLM,
wherein the third structured data comprises or is based on the first set of computing prompts associated with the second unstructured data, a set of responses associated with the first set of computing prompts associated with the second unstructured data, and a computing structure,
wherein the computing structure is not comprised in the set of computing structures associated with the second structured data, and
wherein the computing structure comprises or is based on the first unstructured data, the first computing library associated with the first structured data, the first set of computing prompts associated with the second unstructured data, and the set of computing structures associated with the second structured data; and
transmitting, using the one or more computing device processors, the third structured data to a first system.
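For readability, the following is a minimal, non-limiting sketch in Python of the data flow recited in claim 1. All names (LLMPayload, determine_library, determine_prompts, determine_structures, call_llm, generate_computing_structure) are illustrative assumptions and do not appear in the patent; call_llm is a hypothetical stand-in for any hosted large language model, and the example input values are fabricated for demonstration only.

# A minimal sketch of the claimed data flow, assuming a generic, hypothetical
# call_llm client rather than any specific LLM API. All identifiers below are
# illustrative and are not taken from the patent.

from dataclasses import dataclass, field
import json


@dataclass
class LLMPayload:
    """Accumulates the four inputs that claim 1 transmits to the LLM."""
    unstructured_text: str = ""                                      # first unstructured data
    computing_library: dict = field(default_factory=dict)           # from first structured data
    prompts: list[str] = field(default_factory=list)                 # from second unstructured data
    candidate_structures: list[dict] = field(default_factory=list)  # from second structured data


def determine_library(first_structured: dict) -> dict:
    # Determine a computing library associated with the first structured data
    # (illustrative: a simple field-to-type lookup derived from the data).
    return {key: type(value).__name__ for key, value in first_structured.items()}


def determine_prompts(second_unstructured: str) -> list[str]:
    # Determine a first set of computing prompts associated with the second
    # unstructured data (illustrative: one prompt per non-empty line).
    return [line.strip() for line in second_unstructured.splitlines() if line.strip()]


def determine_structures(second_structured: list[dict], first_format: str) -> list[dict]:
    # Determine, based on the first computing format, a set of computing
    # structures associated with the second structured data.
    return [s for s in second_structured if s.get("format") == first_format]


def call_llm(payload: LLMPayload) -> dict:
    # Hypothetical LLM call. A real implementation would serialize the payload
    # into a prompt, invoke a hosted model, and parse its structured reply.
    # Here a reply shaped like the claim's "third structured data" is fabricated.
    return {
        "format": "second_format",
        "prompts": payload.prompts,
        "responses": [f"response to: {p}" for p in payload.prompts],
        "computing_structure": {
            "text": payload.unstructured_text,
            "library": payload.computing_library,
            "prompts": payload.prompts,
            "based_on_structures": payload.candidate_structures,
        },
    }


def generate_computing_structure(first_unstructured, first_structured,
                                 second_unstructured, second_structured,
                                 first_format) -> str:
    payload = LLMPayload(
        unstructured_text=first_unstructured,
        computing_library=determine_library(first_structured),
        prompts=determine_prompts(second_unstructured),
        candidate_structures=determine_structures(second_structured, first_format),
    )
    third_structured = call_llm(payload)            # receive third structured data from the LLM
    return json.dumps(third_structured, indent=2)   # serialize for transmission to a first system


if __name__ == "__main__":
    print(generate_computing_structure(
        first_unstructured="Free-text notes received from a first data source.",
        first_structured={"record_id": 42, "score": 0.7},
        second_unstructured="Summarize the record.\nList any follow-up actions.",
        second_structured=[{"format": "first_format", "name": "summary_structure"},
                           {"format": "other_format", "name": "audit_structure"}],
        first_format="first_format",
    ))

As in the claim, the four inputs may be transmitted to the LLM together or at separate times; for simplicity the sketch batches them into a single payload, and the serialized return value stands in for transmitting the third structured data to a first system.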