US 11,860,707 B2
Current prediction-based instruction throttling control
Brian Thomas Vanderpool, Byron, MN (US); Gerald Mark Grabowski, Kellogg, MN (US); Jeffrey A. Stuecheli, Austin, TX (US); Michael Stephen Floyd, Cedar Park, TX (US); and Matthew A. Cooke, Cedar Park, TX (US)
Assigned to International Business Machines Corporation, Armonk, NY (US)
Filed by International Business Machines Corporation, Armonk, NY (US)
Filed on Feb. 15, 2023, as Appl. No. 18/169,274.
Application 18/169,274 is a continuation of application No. 17/460,163, filed on Aug. 28, 2021, granted, now Pat. No. 11,625,087.
Prior Publication US 2023/0195202 A1, Jun. 22, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 1/32 (2019.01); G06F 1/3234 (2019.01); G06F 1/3206 (2019.01)
CPC G06F 1/3243 (2013.01) [G06F 1/3206 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A computer system for managing energy consumption of a plurality of processor cores in a multicore processing device comprising:
a hardware-based control system comprising:
one or more power proxy modules, wherein each power proxy module of the one or more power proxy modules is configured to:
translate an activity level of one or more processor cores to a charge value;
one or more charge value accumulators, wherein each charge value accumulator of the one or more charge value accumulators is communicatively coupled to a respective power proxy module, and wherein each charge value accumulator is configured to:
accumulate the charge values from the respective power proxy module; and
generate, at least partially subject to the accumulated charge values, one or more charge replenishment requests associated with the respective one or more processor cores;
a delay queue configured to receive the one or more charge replenishment requests, wherein the delay queue is operably coupled to the one or more charge value accumulators; and
a pending queue configured to receive the charge replenishment requests prior to the delay queue, wherein the pending queue is communicatively coupled to the delay queue.
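The claimed control flow can be sketched in software as a simple behavioral model: a power proxy translates core activity into charge, an accumulator raises a replenishment request once enough charge has accumulated, and requests pass through a pending queue before entering the delay queue. This is an illustrative sketch only, not the patented hardware; all class names, the activity-to-charge scale factor, the threshold, and the one-request-per-step queue advance are assumptions made for the example.

```python
from collections import deque

class PowerProxy:
    """Illustrative model: translates a core's activity level to a charge value.
    The linear scale factor is an assumption for this sketch."""
    def __init__(self, charge_per_activity_unit):
        self.charge_per_activity_unit = charge_per_activity_unit

    def translate(self, activity_level):
        return activity_level * self.charge_per_activity_unit

class ChargeAccumulator:
    """Accumulates charge values from its respective power proxy and emits a
    charge replenishment request when the total crosses a threshold
    (threshold-triggered behavior is assumed for illustration)."""
    def __init__(self, proxy, threshold):
        self.proxy = proxy
        self.threshold = threshold
        self.accumulated = 0

    def sample(self, activity_level):
        self.accumulated += self.proxy.translate(activity_level)
        if self.accumulated >= self.threshold:
            self.accumulated -= self.threshold
            return {"request": "replenish", "charge": self.threshold}
        return None  # no replenishment needed yet

class ThrottleController:
    """Routes replenishment requests through a pending queue and then a delay
    queue, mirroring the claimed ordering (pending queue feeds delay queue)."""
    def __init__(self):
        self.pending_queue = deque()
        self.delay_queue = deque()

    def submit(self, request):
        self.pending_queue.append(request)

    def advance(self):
        # One step: promote a request from pending to delay, then release
        # the oldest delayed request, if any, for servicing.
        if self.pending_queue:
            self.delay_queue.append(self.pending_queue.popleft())
        if self.delay_queue:
            return self.delay_queue.popleft()
        return None
```

In this model the delay queue provides the spacing between a request being raised and its charge being replenished, which is where a throttling decision for the associated core would hook in.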