The patent badge is an abbreviated version of the USPTO patent document. It covers the following: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPC classifications, and abstract. The badge also contains a link to the full patent document in PDF (Adobe Acrobat) format.
Patent No.:
Date of Patent: Nov. 26, 2013
Filed: Nov. 21, 2008
Applicant:
David B. Glasco, Austin, TX (US);
Peter B. Holmqvist, Cary, NC (US);
George R. Lynch, Raleigh, NC (US);
Patrick R. Marchand, Apex, NC (US);
Karan Mehra, Cary, NC (US);
James Roberts, Austin, TX (US)
Inventors:
David B. Glasco, Austin, TX (US);
Peter B. Holmqvist, Cary, NC (US);
George R. Lynch, Raleigh, NC (US);
Patrick R. Marchand, Apex, NC (US);
Karan Mehra, Cary, NC (US);
James Roberts, Austin, TX (US)
Assignee: Nvidia Corporation, Santa Clara, CA (US)
Abstract
One embodiment of the present invention sets forth a compression status bit cache with deterministic latency for isochronous memory clients of compressed memory. The compression status bit cache improves overall memory system performance by providing on-chip availability of compression status bits that are used to size and interpret a memory access request to compressed memory. To avoid non-deterministic latency when an isochronous memory client accesses the compression status bit cache, two design features are employed. The first design feature involves bypassing any intermediate cache when the compression status bit cache reads a new cache line in response to a cache read miss, thereby eliminating additional, potentially non-deterministic latencies outside the scope of the compression status bit cache. The second design feature involves maintaining a minimum pool of clean cache lines by opportunistically writing back dirty cache lines and, optionally, temporarily blocking non-critical requests that would dirty already clean cache lines. With clean cache lines available to be overwritten quickly, the compression status bit cache avoids incurring additional miss write back latencies.
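As a rough illustration of the two design features described in the abstract, the following C++ sketch models a cache that keeps a minimum pool of write-back-free lines so a read miss can always allocate a victim without waiting on a dirty-line write-back. It is an assumption-laden software analogy, not the patented hardware design: the class name StatusBitCache, the minClean parameter, and all method names are invented here for clarity, and the miss path only notes where a fill would bypass an intermediate cache rather than modeling it.

// Illustrative sketch only, not the patented hardware: a software model of a cache
// that keeps a minimum pool of write-back-free ("clean") lines so that a read miss
// can always allocate a victim without a dirty-line write-back stall.
// All names, sizes, and policies below are assumptions made for clarity.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Line {
    uint64_t tag;
    bool     valid;
    bool     dirty;
};

class StatusBitCache {
public:
    StatusBitCache(std::size_t numLines, std::size_t minClean)
        : lines_(numLines), minClean_(minClean) {}

    // Read miss: pick a victim that needs no write-back, so the fill latency is
    // bounded. Per the abstract, the fill itself would also bypass any
    // intermediate cache (first design feature).
    std::size_t allocateForMiss(uint64_t tag) {
        std::size_t victim = findReusableLine();  // guaranteed by the clean pool
        lines_[victim] = Line{tag, true, false};
        return victim;
    }

    // Second design feature: opportunistically write dirty lines back whenever
    // the pool of reusable lines falls below the configured minimum.
    void maintainCleanPool() {
        while (countReusable() < minClean_) {
            std::size_t d = findDirtyLine();
            if (d == lines_.size()) break;        // nothing left to clean
            writeBackToMemory(d);                 // stand-in for the DRAM write
            lines_[d].dirty = false;
        }
    }

    // Optionally block a non-critical write that would dirty a clean line while
    // the clean pool is at its minimum (also described in the abstract).
    bool canAcceptWrite() const { return countReusable() > minClean_; }

private:
    // A line is reusable without a write-back if it is invalid or clean.
    std::size_t countReusable() const {
        std::size_t n = 0;
        for (const Line& l : lines_)
            if (!l.valid || !l.dirty) ++n;
        return n;
    }
    std::size_t findReusableLine() const {
        for (std::size_t i = 0; i < lines_.size(); ++i)
            if (!lines_[i].valid || !lines_[i].dirty) return i;
        return 0;  // not reached while the clean pool is maintained
    }
    std::size_t findDirtyLine() const {
        for (std::size_t i = 0; i < lines_.size(); ++i)
            if (lines_[i].valid && lines_[i].dirty) return i;
        return lines_.size();
    }
    void writeBackToMemory(std::size_t /*index*/) { /* placeholder */ }

    std::vector<Line> lines_;
    std::size_t       minClean_;
};

Because a victim line is always available without a write-back, the miss path's latency stays bounded, which is the property the abstract describes as deterministic latency for isochronous memory clients.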