Deep learning using large codeword model with homomorphically compressed data
Brian Galvin, Silverdale, WA (US)
Assigned to ATOMBEAM TECHNOLOGIES INC, Moraga, CA (US)
Filed by AtomBeam Technologies Inc., Moraga, CA (US)
Filed on Oct. 9, 2024, as Appl. No. 18/909,976.
Application 18/909,976 is a continuation-in-part of application No. 18/770,652, filed on Jul. 12, 2024.
Application 18/909,976 is a continuation-in-part of application No. 18/755,653, filed on Jun. 26, 2024.
Application 18/909,976 is a continuation-in-part of application No. 18/737,906, filed on Jun. 7, 2024.
Application 18/909,976 is a continuation-in-part of application No. 18/736,498, filed on Jun. 6, 2024.
Application 18/755,653 is a continuation-in-part of application No. 18/657,683, filed on May 7, 2024, granted, now 12,159,216.
Application 18/657,683 is a continuation-in-part of application No. 18/648,340, filed on Apr. 27, 2024, granted, now 12,166,507.
Application 18/648,340 is a continuation-in-part of application No. 18/427,716, filed on Jan. 30, 2024, granted, now 12,093,972, issued on Sep. 17, 2024.
Application 18/427,716 is a continuation-in-part of application No. 18/410,980, filed on Jan. 11, 2024, granted, now 12,068,761, issued on Aug. 20, 2024.
Application 18/410,980 is a continuation-in-part of application No. 18/537,728, filed on Dec. 12, 2023, granted, now 12,058,333, issued on Aug. 6, 2024.
Application 18/770,652 is a continuation-in-part of application No. 18/503,135, filed on Nov. 6, 2023.
Application 18/503,135 is a continuation of application No. 18/305,305, filed on Apr. 21, 2023, granted, now 11,811,428, issued on Nov. 7, 2023.
Application 18/305,305 is a continuation-in-part of application No. 18/190,044, filed on Mar. 24, 2023, granted, now 11,831,343, issued on Nov. 28, 2023.
Application 18/190,044 is a continuation-in-part of application No. 17/875,201, filed on Jul. 27, 2022, granted, now 11,700,013, issued on Jul. 11, 2023.
Application 18/190,044 is a continuation-in-part of application No. 17/727,913, filed on Apr. 25, 2022, granted, now 11,620,051, issued on Apr. 4, 2023.
Application 17/875,201 is a continuation of application No. 17/514,913, filed on Oct. 29, 2021, granted, now 11,424,760, issued on Aug. 23, 2022.
Application 17/875,201 is a continuation of application No. 17/458,747, filed on Aug. 27, 2021, granted, now 11,422,978, issued on Aug. 23, 2022.
Application 17/727,913 is a continuation of application No. 17/404,699, filed on Aug. 17, 2021, granted, now 11,385,794, issued on Jul. 12, 2022.
Application 17/514,913 is a continuation-in-part of application No. 17/404,699, filed on Aug. 17, 2021, granted, now 11,385,794, issued on Jul. 12, 2022.
Application 18/305,305 is a continuation-in-part of application No. 17/234,007, filed on Apr. 19, 2021, granted, now 11,782,879, issued on Oct. 10, 2023.
Application 17/234,007 is a continuation-in-part of application No. 17/180,439, filed on Feb. 19, 2021, granted, now 11,366,790, issued on Jun. 21, 2022.
Application 17/180,439 is a continuation-in-part of application No. 16/923,039, filed on Jul. 7, 2020, granted, now 11,232,076, issued on Jan. 25, 2022.
Application 17/458,747 is a continuation-in-part of application No. 16/923,039, filed on Jul. 7, 2020, granted, now 11,232,076, issued on Jan. 25, 2022.
Application 16/923,039 is a continuation-in-part of application No. 16/716,098, filed on Dec. 16, 2019, granted, now 10,706,018, issued on Jul. 7, 2020.
Application 16/716,098 is a continuation of application No. 16/455,655, filed on Jun. 27, 2019, granted, now 10,509,771, issued on Dec. 17, 2019.
Application 17/404,699 is a continuation-in-part of application No. 16/455,655, filed on Jun. 27, 2019, granted, now 10,509,771, issued on Dec. 17, 2019.
Application 16/455,655 is a continuation-in-part of application No. 16/200,466, filed on Nov. 26, 2018, granted, now 10,476,519, issued on Nov. 12, 2019.
Application 16/200,466 is a continuation-in-part of application No. 15/975,741, filed on May 9, 2018, granted, now 10,303,391, issued on May 28, 2019.
Claims priority of provisional application 63/651,359, filed on May 23, 2024.
Claims priority of provisional application 63/485,518, filed on Feb. 16, 2023.
Claims priority of provisional application 63/388,411, filed on Jul. 12, 2022.
Claims priority of provisional application 63/232,041, filed on Aug. 11, 2021.
Claims priority of provisional application 63/140,111, filed on Jan. 21, 2021.
Claims priority of provisional application 63/027,166, filed on May 19, 2020.
Claims priority of provisional application 62/926,723, filed on Oct. 28, 2019.
Claims priority of provisional application 62/578,824, filed on Oct. 30, 2017.
Prior Publication US 2025/0047295 A1, Feb. 6, 2025
1. A system for deep learning using a large codeword model with homomorphically compressed dyadically encrypted data, comprising:
a computing device comprising at least a memory and a processor;
a plurality of programming instructions stored in the memory and operable on the processor, wherein the plurality of programming instructions, when operating on the processor, cause the computing device to:
receive a plurality of inputs;
preprocess the inputs to generate a plurality of input data sets;
compress and encrypt the input data sets by:
analyzing the input data sets to determine their properties;
creating transformation matrices based on the properties of the input data;
transforming the input data into modified distributions;
generating main data streams of transformed data and secondary data streams of transformation information; and
compressing the main data streams;
tokenize the compressed main data streams into a plurality of sourceblocks;
assign the plurality of sourceblocks a plurality of codewords, wherein each sourceblock is mapped to a particular codeword through a codebook;
process the plurality of codewords through a machine learning core to generate a codeword response;
translate the codeword response into a translated response which matches the modality of the inputs;
decompress and decrypt the translated response; and
train the machine learning core using the decompressed and decrypted response and a plurality of training data.
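The three technical stages of claim 1 can be made concrete with short illustrative sketches. First, the compress-and-encrypt step: the claim calls for analyzing each input data set's properties, deriving transformation matrices, transforming the data into a modified distribution, and emitting a compressed main stream plus a secondary stream of transformation information. The Python sketch below is a minimal stand-in, not the patented method: it substitutes a byte-frequency-ranked substitution table for the claimed transformation matrices and zlib for the claimed compressor, and it omits the homomorphic property entirely. All function names are hypothetical.

```python
import zlib
from collections import Counter

def transform_and_compress(data: bytes) -> tuple[bytes, bytes]:
    """Illustrative stand-in for the claimed compress-and-encrypt step."""
    # Analyze the input data set to determine its properties
    # (here: the byte-frequency distribution).
    freqs = Counter(data)
    ranked = sorted(range(256), key=lambda b: -freqs.get(b, 0))

    # Derive a substitution table (a stand-in for the claimed
    # transformation matrices): most frequent byte -> 0, next -> 1, ...
    table = [0] * 256
    for rank, b in enumerate(ranked):
        table[b] = rank

    # Main stream: the data transformed into the modified distribution,
    # then compressed (zlib stands in for the claimed compressor).
    main_stream = zlib.compress(data.translate(bytes(table)))

    # Secondary stream: the transformation information. Without it the
    # main stream cannot be inverted, which is what lets the transform
    # double as an encryption key in this sketch.
    secondary_stream = bytes(ranked)
    return main_stream, secondary_stream

def decompress_and_decrypt(main_stream: bytes, secondary_stream: bytes) -> bytes:
    """Invert the sketch: decompress, then undo the transformation."""
    return zlib.decompress(main_stream).translate(bytes(secondary_stream))
```

A roundtrip check such as `decompress_and_decrypt(*transform_and_compress(b"abc" * 100)) == b"abc" * 100` holds because the secondary stream is exactly the inverse substitution: `ranked[rank]` recovers the original byte for each transformed byte.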
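Second, the tokenize-and-assign steps: the compressed main stream is split into sourceblocks, and each sourceblock is mapped to a codeword through a codebook, so the codeword stream is what the machine learning core actually consumes. The sketch below assumes fixed-length blocks and a dict-based codebook that grows on first sight of a block; the block size and growth policy are illustrative choices, not taken from the patent.

```python
def tokenize(stream: bytes, block_size: int = 4) -> list[bytes]:
    """Split a compressed main stream into fixed-length sourceblocks.

    The final block may be short; a real system would pad it or use
    variable-length sourceblocks.
    """
    return [stream[i:i + block_size] for i in range(0, len(stream), block_size)]

def assign_codewords(sourceblocks: list[bytes], codebook: dict[bytes, int]) -> list[int]:
    """Map each sourceblock to a codeword through the codebook."""
    codewords = []
    for block in sourceblocks:
        if block not in codebook:
            codebook[block] = len(codebook)  # new block: mint the next codeword
        codewords.append(codebook[block])
    return codewords

codebook: dict[bytes, int] = {}
codewords = assign_codewords(tokenize(b"abcdabcdabcdwxyz"), codebook)
# codewords == [0, 0, 0, 1]: repeated sourceblocks share one codeword,
# so the codeword stream is both compact and model-ready.
```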
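Third, the machine learning core: operating on codewords rather than raw data means the model is a sequence model over codeword IDs, and its output (the codeword response) must be translated back through the codebook and then decompressed and decrypted before it can be compared against training data. The numpy sketch below trains a deliberately tiny next-codeword predictor with one cross-entropy SGD step per pair; the architecture, dimensions, and training loop are illustrative assumptions standing in for the claimed large codeword model.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = 256  # assumed codebook capacity
DIM = 32     # assumed embedding width

emb = rng.normal(0.0, 0.1, (VOCAB, DIM))   # codeword embeddings
out = rng.normal(0.0, 0.1, (DIM, VOCAB))   # output projection

def forward(cw: int) -> np.ndarray:
    """Logits over the next codeword, given the current codeword."""
    return emb[cw] @ out

def train_step(cw: int, target: int, lr: float = 0.1) -> float:
    """One cross-entropy SGD step on a (codeword, next-codeword) pair."""
    global out
    logits = forward(cw)
    p = np.exp(logits - logits.max())
    p /= p.sum()                      # softmax over next-codeword logits
    loss = -np.log(p[target])
    grad_logits = p.copy()
    grad_logits[target] -= 1.0        # dLoss/dLogits for cross-entropy
    grad_emb = out @ grad_logits      # gradient w.r.t. the embedding row
    out -= lr * np.outer(emb[cw], grad_logits)
    emb[cw] -= lr * grad_emb
    return float(loss)

# Train on consecutive codeword pairs; in the claimed system the model's
# codeword response would be translated back to the input modality,
# decompressed, decrypted, and used alongside the training data.
stream = [0, 1, 2, 0, 1, 2, 0, 1, 2]
for _ in range(100):
    for a, b in zip(stream, stream[1:]):
        train_step(a, b)
```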