US 12,218,975 B2
Method and system for processing full-stack network card task based on FPGA
Linge Xiao, Jiangsu (CN); Rui Hao, Jiangsu (CN); and Hongwei Kan, Jiangsu (CN)
Assigned to SUZHOU METABRAIN INTELLIGENT TECHNOLOGY CO., LTD., Jiangsu (CN)
Appl. No. 18/697,423
Filed by SUZHOU METABRAIN INTELLIGENT TECHNOLOGY CO., LTD., Jiangsu (CN)
PCT Filed Sep. 29, 2022, PCT No. PCT/CN2022/122791
§ 371(c)(1), (2) Date Mar. 29, 2024,
PCT Pub. No. WO2023/159957, PCT Pub. Date Aug. 31, 2023.
Claims priority of application No. 202210171789.1 (CN), filed on Feb. 24, 2022.
Prior Publication US 2024/0333766 A1, Oct. 3, 2024
Int. Cl. H04L 9/40 (2022.01); G06F 13/42 (2006.01)
CPC H04L 63/166 (2013.01) [G06F 13/4221 (2013.01); G06F 2213/0026 (2013.01)] 18 Claims
OG exemplary drawing
 
1. A method for processing a full-stack network card task based on Field-Programmable Gate Array (FPGA), the method comprising:
receiving to-be-processed data, and offloading a Transmission Control Protocol (TCP)/Internet Protocol (IP) task from the to-be-processed data by a built-in TCP offload engine, to obtain first processed data;
offloading a Secure Sockets Layer (SSL)/Transport Layer Security (TLS) protocol task from the first processed data, to obtain second processed data; and
acquiring, by a host, dynamic configuration information of a Partial Reconfiguration (PR) region where the second processed data is located, and configuring the PR region based on the dynamic configuration information, so that the PR region offloads and processes computation-intensive tasks in the second processed data;
wherein the PR region is a neural network model or an image inference model.
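The following is a minimal host-side sketch, provided for illustration only, of the flow recited in claim 1: TCP/IP offload by the built-in TCP offload engine, SSL/TLS offload, and host-driven dynamic configuration of the PR region for the computation-intensive task. It is not the patented implementation; every type and function name (buffer_t, toe_offload_tcp_ip, offload_ssl_tls, host_configure_pr_region, pr_region_process, the bitstream identifier "nn_inference_v1") is a hypothetical stand-in for vendor-specific FPGA and driver APIs.

    /*
     * Illustrative sketch only -- not the patented implementation.
     * Models the claimed flow: TCP/IP offload -> SSL/TLS offload ->
     * host configures a Partial Reconfiguration (PR) region, which then
     * processes the compute-intensive task. All names are hypothetical.
     */
    #include <stdio.h>

    typedef struct {
        unsigned char payload[256];
        size_t len;
    } buffer_t;

    /* Hypothetical stand-in: built-in TCP offload engine (TOE) handles
     * the TCP/IP task in hardware; here the step is only simulated. */
    static buffer_t toe_offload_tcp_ip(const buffer_t *raw) {
        buffer_t out = *raw;               /* "first processed data"  */
        printf("TOE: TCP/IP task offloaded (%zu bytes)\n", out.len);
        return out;
    }

    /* Hypothetical stand-in: SSL/TLS protocol task offloaded on the card. */
    static buffer_t offload_ssl_tls(const buffer_t *first) {
        buffer_t out = *first;             /* "second processed data" */
        printf("TLS: SSL/TLS task offloaded (%zu bytes)\n", out.len);
        return out;
    }

    /* Hypothetical stand-in: host acquires dynamic configuration
     * information (e.g. which partial bitstream to load) and programs
     * the PR region accordingly. */
    static void host_configure_pr_region(const char *bitstream_id) {
        printf("Host: loading partial bitstream '%s' into PR region\n",
               bitstream_id);
    }

    /* Hypothetical stand-in: the reconfigured PR region (e.g. a neural
     * network or image inference accelerator) processes the
     * computation-intensive task in the second processed data. */
    static void pr_region_process(const buffer_t *second) {
        printf("PR region: processing %zu bytes of offloaded data\n",
               second->len);
    }

    int main(void) {
        buffer_t raw = { .len = 128 };                 /* to-be-processed data     */
        buffer_t first  = toe_offload_tcp_ip(&raw);    /* step 1: TCP/IP offload   */
        buffer_t second = offload_ssl_tls(&first);     /* step 2: SSL/TLS offload  */
        host_configure_pr_region("nn_inference_v1");   /* step 3: configure PR     */
        pr_region_process(&second);                    /* step 4: offloaded compute */
        return 0;
    }

In a real system the three offload stages would run on the FPGA data path and the host would interact only through the dynamic configuration step; the sequential C program above merely mirrors the ordering of the claimed method steps.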