US 12,111,761 B2
Memory cache management based on storage capacity for parallel independent threads
Luca Bert, San Jose, CA (US)
Assigned to Micron Technology, Inc., Boise, ID (US)
Filed by MICRON TECHNOLOGY, INC., Boise, ID (US)
Filed on Mar. 7, 2022, as Appl. No. 17/688,506.
Application 17/688,506 is a continuation of application No. 16/922,959, filed on Jul. 7, 2020, granted, now Pat. No. 11,275,687.
Prior Publication US 2022/0188231 A1, Jun. 16, 2022
Int. Cl. G06F 12/0808 (2016.01); G06F 12/0871 (2016.01)
CPC G06F 12/0808 (2013.01) [G06F 12/0871 (2013.01); G06F 2212/50 (2013.01)] 17 Claims
OG exemplary drawing
 
1. A method comprising:
programming data items to memory pages of a cache that resides in a first portion of a memory device, wherein each of the data items programmed to the memory pages is associated with at least one of a plurality of processing threads of a host system;
responsive to determining that a threshold number of memory pages, associated with a first processing thread, are programmed in the cache, identifying, in a first block of the cache, at least one first memory page including a first data item associated with the first processing thread and, in a second block of the cache, at least one second memory page including a second data item associated with the first processing thread;
copying the first data item and the second data item to a second portion of the memory device while retaining, in each of the first block and the second block, other data items of other processing threads, wherein the second portion of the memory device is provisioned as host space that is addressable by the host system; and
designating the at least one first memory page and the at least one second memory page as invalid.
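
The claimed flow can be illustrated with a minimal C sketch. This is an assumption-laden illustration, not the patent's implementation: the types page_t and block_t, the functions program_data_item(), cached_pages_for(), and migrate_thread(), and the constants THREAD_THRESHOLD, NUM_BLOCKS, and PAGES_PER_BLOCK are hypothetical names chosen for this example. It models programming data items into cache pages tagged with the owning host thread, checking a per-thread page threshold, copying that thread's data items out of multiple cache blocks into a host-addressable region, and designating only those source pages invalid while other threads' data stays in the same blocks.

/* Hypothetical sketch of the claimed cache-migration flow; all names are
 * illustrative and do not come from the patent. */
#include <stdio.h>
#include <stdbool.h>

#define NUM_BLOCKS        4
#define PAGES_PER_BLOCK   2
#define HOST_SPACE_PAGES  64
#define THREAD_THRESHOLD  3   /* threshold number of cached pages per thread */

typedef struct {
    int  thread_id;   /* host processing thread that owns this data item */
    int  data;        /* stand-in for the data item payload */
    bool valid;       /* false once the page is designated invalid */
    bool used;
} page_t;

typedef struct {
    page_t pages[PAGES_PER_BLOCK];
} block_t;

static block_t cache[NUM_BLOCKS];            /* first portion: the cache */
static int     host_space[HOST_SPACE_PAGES]; /* second portion: host-addressable */
static int     host_used = 0;

/* Count valid cached pages associated with one processing thread. */
static int cached_pages_for(int thread_id)
{
    int count = 0;
    for (int b = 0; b < NUM_BLOCKS; b++)
        for (int p = 0; p < PAGES_PER_BLOCK; p++)
            if (cache[b].pages[p].used && cache[b].pages[p].valid &&
                cache[b].pages[p].thread_id == thread_id)
                count++;
    return count;
}

/* Copy one thread's data items from the cache into host space and mark the
 * source pages invalid, leaving other threads' pages in the same blocks intact. */
static void migrate_thread(int thread_id)
{
    for (int b = 0; b < NUM_BLOCKS; b++) {
        for (int p = 0; p < PAGES_PER_BLOCK; p++) {
            page_t *pg = &cache[b].pages[p];
            if (pg->used && pg->valid && pg->thread_id == thread_id) {
                host_space[host_used++] = pg->data; /* copy to second portion */
                pg->valid = false;                  /* designate page invalid */
            }
        }
    }
}

/* Program one data item into the first free cache page, then check the
 * per-thread threshold and migrate that thread's items if it is reached. */
static bool program_data_item(int thread_id, int data)
{
    for (int b = 0; b < NUM_BLOCKS; b++) {
        for (int p = 0; p < PAGES_PER_BLOCK; p++) {
            page_t *pg = &cache[b].pages[p];
            if (!pg->used) {
                pg->used = true;
                pg->valid = true;
                pg->thread_id = thread_id;
                pg->data = data;
                if (cached_pages_for(thread_id) >= THREAD_THRESHOLD)
                    migrate_thread(thread_id);
                return true;
            }
        }
    }
    return false; /* cache full */
}

int main(void)
{
    /* Interleave writes from two host threads; thread 1 reaches the threshold
     * on its third item, which by then is spread across three cache blocks,
     * so all three of its items are migrated while thread 2's stay cached. */
    program_data_item(1, 10);
    program_data_item(2, 20);
    program_data_item(1, 11);
    program_data_item(2, 21);
    program_data_item(1, 12); /* triggers migration of thread 1 */
    printf("host space holds %d migrated data items\n", host_used);
    return 0;
}

In this sketch the per-thread page count is recomputed by scanning the cache on every write; a real controller would more plausibly keep a running counter per thread, but the scan keeps the example self-contained.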