US 12,380,032 B2
Memory-aware pre-fetching and cache bypassing systems and methods
David Andrew Roberts, Wellesley, MA (US)
Filed by Lodestar Licensing Group, LLC, Evanston, IL (US)
Filed on Feb. 15, 2024, as Appl. No. 18/442,676.
Application 18/442,676 is a continuation of application No. 17/543,378, filed on Dec. 6, 2021, granted, now 11,934,317.
Application 17/543,378 is a continuation of application No. 16/525,106, filed on Jul. 29, 2019, granted, now 11,194,728, issued on Dec. 7, 2021.
Prior Publication US 2024/0248849 A1, Jul. 25, 2024
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 12/0888 (2016.01); G06F 12/0862 (2016.01); G06F 12/0897 (2016.01)
CPC G06F 12/0888 (2013.01) [G06F 12/0862 (2013.01); G06F 12/0897 (2013.01)] 18 Claims
OG exemplary drawing
 
1. A computing system comprising:
a processor configured to generate a first request targeting a first memory cell;
a memory bus coupled to the processor;
a memory array comprising a plurality of memory pages, wherein a first memory page of the plurality of memory pages comprises a first plurality of memory cells, and the first plurality of memory cells comprises the first memory cell; and
a memory controller coupled to the memory bus and the memory array, wherein the memory controller is configured to:
determine a first parameter associated with a probability of the processor to generate a number of successive requests targeting the first plurality of memory cells; and
disable caching or pre-fetching the first plurality of memory cells in response to the first request based on the first parameter being equal to or above a threshold, and
wherein the processor is configured to generate a second request targeting a second memory cell of a second plurality of memory cells after generating the first request, wherein a second memory page of the plurality of memory pages comprises the second plurality of memory cells, and wherein the memory controller is configured to:
determine a second parameter associated with a probability of the processor to generate a number of successive requests targeting the second memory page;
enable caching or pre-fetching the second plurality of memory cells in response to the second request based on the first parameter being greater than the second parameter; and
disable caching or pre-fetching the second plurality of memory cells in response to the second request based on the second parameter being greater than the first parameter.
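
The following is a minimal, hypothetical C sketch of the decision logic recited in claim 1, assuming the claimed "parameter associated with a probability" is tracked per page as a floating-point estimate. The names page_stats, cache_or_prefetch_first, cache_or_prefetch_second, and the value of BYPASS_THRESHOLD are illustrative assumptions and do not appear in the patent.

#include <stdbool.h>
#include <stdio.h>

#define BYPASS_THRESHOLD 0.75  /* hypothetical threshold on the first parameter */

/* Per-page parameter: estimated probability that the processor issues a
 * run of successive requests targeting cells of that page. */
typedef struct {
    double param;
} page_stats;

/* First request: disable caching/pre-fetching of the targeted page's cells
 * when its parameter is equal to or above the threshold. */
static bool cache_or_prefetch_first(const page_stats *first_page)
{
    return !(first_page->param >= BYPASS_THRESHOLD);
}

/* Second request (to a different page): enable caching/pre-fetching of the
 * second page's cells when the first page's parameter is greater than the
 * second page's parameter; disable it in the opposite case. */
static bool cache_or_prefetch_second(const page_stats *first_page,
                                     const page_stats *second_page)
{
    return first_page->param > second_page->param;
}

int main(void)
{
    page_stats page_a = { .param = 0.9 };  /* long runs expected within page A */
    page_stats page_b = { .param = 0.3 };  /* shorter runs expected within page B */

    printf("first request to A : %s\n",
           cache_or_prefetch_first(&page_a) ? "cache/pre-fetch" : "bypass");
    printf("second request to B: %s\n",
           cache_or_prefetch_second(&page_a, &page_b) ? "cache/pre-fetch" : "bypass");
    return 0;
}

Under this reading, a high first parameter means the processor is likely to keep issuing successive requests to the already-targeted first page, so the controller bypasses caching and pre-fetching for that page; on a later request to a second page, the controller caches or pre-fetches the second page's cells only when the first page's parameter exceeds the second page's, as the claim recites.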