The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventors, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The patent badge also contains a link to the full patent document in Adobe Acrobat (PDF) format, which can be used to download or print the patent.
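
For readers who want to handle badge data programmatically, here is a minimal, hypothetical sketch of a record holding the fields listed above. The class name, field names, and types are illustrative assumptions, not the site's actual schema or API.

```python
# Hypothetical record for the fields a patent badge carries, as listed above.
# Names and types are illustrative assumptions, not the site's actual schema.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PatentBadge:
    patent_number: str
    date_issued: str              # date the patent was issued
    date_filed: str               # date the patent was filed
    title: str
    applicant: str
    inventors: List[str]
    assignee: str
    attorney_firm: Optional[str]
    primary_examiner: Optional[str]
    assistant_examiner: Optional[str]
    cpcs: List[str]               # CPC classification codes
    abstract: str
    pdf_url: str                  # link to the full patent document (PDF)
```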

Date of Patent: Jul. 28, 2020
Filed: Feb. 05, 2019
Applicant: Arm Limited, Cambridge, GB
Inventors: Lei Ma, Cambridge, GB; Alexander Alfred Hornung, Cambridge, GB; Ian Michael Caulfield, Cambridge, GB
Assignee: Arm Limited, Cambridge, GB
Attorney:
Primary Examiner:
Int. Cl.: G06F 12/08 (2016.01); G06F 12/0871 (2016.01); G06F 12/0877 (2016.01); G06F 12/0895 (2016.01)
U.S. Cl. CPC: G06F 12/0871 (2013.01); G06F 12/0877 (2013.01); G06F 12/0895 (2013.01)
Abstract

An apparatus comprises a cache memory to store data as a plurality of cache lines each having a data size and an associated physical address in a memory, access circuitry to access the data stored in the cache memory, detection circuitry to detect, for at least a set of sub-units of the cache lines stored in the cache memory, whether a number of accesses by the access circuitry to a given sub-unit exceeds a predetermined threshold, in which each sub-unit has a data size that is smaller than the data size of a cache line, prediction circuitry to generate a prediction, for a given region of a plurality of regions of physical address space, of whether data stored in that region comprises streaming data in which each of one or more portions of the given cache line is predicted to be subject to a maximum of one read operation or multiple access data in which each of the one or more portions of the given cache line is predicted to be subject to more than one read operation, the prediction circuitry being configured to generate the prediction in response to a detection by the detection circuitry of whether the number of accesses to a sub-unit of a cache line having an associated physical address in the given region exceeds the predetermined threshold, and allocation circuitry to selectively allocate a next cache line to the cache memory in dependence upon the prediction applicable to the region of physical address space containing that next cache line.
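
To make the abstract's mechanism easier to follow, below is a minimal software sketch of the idea it describes: count accesses at sub-unit granularity, predict per region of physical address space whether the data there is streaming or multiple-access, and consult that prediction when allocating the next cache line. All sizes, the threshold value, and the class and method names are illustrative assumptions, not values or structures taken from the patent.

```python
# Minimal software model of the scheme described in the abstract.
# Sizes, threshold, and names are illustrative assumptions, not patent values.
from collections import defaultdict

SUB_UNIT_SIZE = 16    # bytes per sub-unit, smaller than a cache line (assumed)
REGION_SIZE = 4096    # bytes per region of physical address space (assumed)
THRESHOLD = 1         # accesses beyond which a sub-unit counts as reused (assumed)

class RegionPredictor:
    """Tracks sub-unit access counts and predicts, per region, whether its data
    is 'streaming' (each portion read at most once) or 'multiple-access'."""

    def __init__(self):
        self.access_counts = defaultdict(int)               # sub-unit -> accesses
        self.prediction = defaultdict(lambda: "streaming")  # region -> label

    def record_access(self, phys_addr):
        # Detection: count accesses at sub-unit granularity.
        sub_unit = phys_addr // SUB_UNIT_SIZE
        self.access_counts[sub_unit] += 1
        # Prediction: if any sub-unit in a region is accessed more than the
        # threshold, predict that the region holds multiple-access data.
        if self.access_counts[sub_unit] > THRESHOLD:
            self.prediction[phys_addr // REGION_SIZE] = "multiple-access"

    def should_allocate_full_line(self, phys_addr):
        # Allocation: consult the prediction for the region containing the
        # next cache line and allocate selectively.
        return self.prediction[phys_addr // REGION_SIZE] == "multiple-access"

predictor = RegionPredictor()
predictor.record_access(0x1000)
predictor.record_access(0x1000)                      # second access: reuse detected
print(predictor.should_allocate_full_line(0x1040))   # True: same region, reused data
print(predictor.should_allocate_full_line(0x8000))   # False: region still looks streaming
```

In hardware the patent describes dedicated detection, prediction, and allocation circuitry; the sketch above only mirrors that decision flow in software.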

