US 12,321,276 B2
Prefetching cached data for predicted accesses
Gabriel Zvi BenHanokh, Tel-Aviv (IL); and Yehoshua Salomon, Kfar Saba (IL)
Assigned to Red Hat, Inc., Raleigh, NC (US)
Filed by Red Hat, Inc., Raleigh, NC (US)
Filed on Aug. 31, 2022, as Appl. No. 17/900,615.
Prior Publication US 2024/0070079 A1, Feb. 29, 2024
Int. Cl. G06F 12/0862 (2016.01)
CPC G06F 12/0862 (2013.01) [G06F 2212/602 (2013.01)]
20 Claims
OG exemplary drawing
 
1. A method comprising:
identifying, by a computing device, one or more requested data items requested by a client system;
identifying, in view of the one or more requested data items, a plurality of predicted data items and, for each predicted data item, a respective probability that a predicted data item of the plurality of predicted data items will be requested by a subsequent access request;
identifying a plurality of cacheable data items, wherein the plurality of cacheable data items comprise one or more of the plurality of predicted data items,
wherein each cacheable data item comprises the predicted data item that satisfies caching criteria,
wherein the caching criteria are evaluated in view of the respective probability that the predicted data item will be requested by the subsequent access request,
wherein the predicted data item satisfies the caching criteria if the respective probability that the predicted data item will be requested based on a load of a respective storage server at which the predicted data item is located satisfies a weight threshold; and
storing each of the plurality of cacheable data items in a cache memory of a respective storage server at which the cacheable data item is located.
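The claimed flow can be sketched informally as: predict likely follow-up accesses, keep only predictions whose probability, adjusted for the load of the storage server holding the item, clears a threshold, and place each surviving item in that server's cache. The sketch below is an illustrative reading only; every name (`StorageServer`, `Prediction`, `WEIGHT_THRESHOLD`) and the specific load-weighting formula are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the claimed prefetching method. The weighting
# formula (probability scaled down by server load) is one plausible
# interpretation of the claim's caching criteria, not the patent's text.
from dataclasses import dataclass, field


@dataclass
class StorageServer:
    name: str
    load: float                          # 0.0 (idle) .. 1.0 (saturated)
    cache: set = field(default_factory=set)


@dataclass
class Prediction:
    item: str
    probability: float                   # chance of a subsequent request
    server: StorageServer                # server where the item is located


WEIGHT_THRESHOLD = 0.5                   # assumed value of the weight threshold


def prefetch(requested, predict, threshold=WEIGHT_THRESHOLD):
    """Identify predicted items and cache those satisfying the criteria.

    `requested` is the list of items the client just asked for;
    `predict` maps it to a list of Prediction objects.
    """
    cached = []
    for p in predict(requested):
        # Evaluate the caching criteria: weight the request probability
        # by the load of the server holding the item, so a busy server
        # lowers the effective score.
        score = p.probability * (1.0 - p.server.load)
        if score >= threshold:
            # Store the cacheable item in the cache memory of the
            # respective storage server at which it is located.
            p.server.cache.add(p.item)
            cached.append(p.item)
    return cached
```

For example, with two predictions of equal probability, the item on a lightly loaded server is prefetched while the item on a heavily loaded one is skipped, matching the claim's load-dependent criterion.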