The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The badge also contains a link to the full patent document in PDF (Adobe Acrobat) format, which can be downloaded or printed.
Patent No.:
Date of Patent: Oct. 04, 1994
Filed: Apr. 29, 1992
Inventors: Rajiv N Patel, San Jose, CA (US); Adam Malamy, Winchester, MA (US); Norman M Hayes, Sunnyvale, CA (US)
Assignee: Sun Microsystems, Inc., Mountain View, CA (US)
Abstract
A cache array, a cache tag and comparator unit and a cache multiplexor are provided to a cache memory. Each cache operation performed against the cache array, read or write, takes only half a clock cycle. The cache tag and comparator unit comprises a cache tag array, a cache miss buffer and control logic. Each cache operation performed against the cache tag array, read or write, also takes only half a clock cycle. The cache miss buffer comprises cache miss descriptive information identifying the current state of a cache fill in progress. The control logic comprises a plurality of combinatorial logics for performing tag match operations. In addition to standard tag match operations, the control logic also conditionally tag matches an accessing address against an address tag stored in the cache miss buffer. Depending on the results of the tag match operations, and further depending on the state of the current cache fill if the accessing address is part of the memory block frame of the current cache fill, the control logic provides appropriate signals to the cache array, the cache multiplexor, the main memory and the instruction/data destination. As a result, subsequent instruction/data requests that are part of a current cache fill in progress can be satisfied without having to wait for the completion of the current cache fill, thereby further reducing cache miss penalties and function unit idle time.
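The behavior described in the abstract can be made concrete with a short software sketch. The following Python model is purely illustrative and is not the patented hardware: it assumes a simple direct-mapped organization with a fixed number of words per block frame, and the names (CacheMissBuffer, start_fill, word_arrived, lookup) are hypothetical. It shows the conditional tag match against the miss buffer that lets a request falling inside the block of a cache fill in progress be satisfied from the words already returned, instead of waiting for the fill to complete.

BLOCK_WORDS = 4  # assumed number of words per memory block frame


class CacheMissBuffer:
    """Holds the cache miss descriptive information for the fill in progress."""

    def __init__(self):
        self.valid = False      # is a cache fill currently in progress?
        self.block_tag = None   # address tag of the block being filled
        self.words = {}         # word offset -> data already returned by memory

    def start_fill(self, block_tag):
        self.valid = True
        self.block_tag = block_tag
        self.words = {}

    def word_arrived(self, offset, data):
        self.words[offset] = data
        if len(self.words) == BLOCK_WORDS:   # last word returned: fill complete
            self.valid = False


def lookup(tag_array, data_array, miss_buffer, address):
    """Return (hit, data) for an access, checking the tag array, then the miss buffer."""
    block_tag, offset = divmod(address, BLOCK_WORDS)

    # Standard tag match against the cache tag array.
    if block_tag in tag_array:
        return True, data_array[block_tag][offset]

    # Conditional tag match against the miss buffer: if the access is part of
    # the memory block frame of the current fill and that word has already
    # arrived, satisfy it without waiting for the fill to complete.
    if miss_buffer.valid and miss_buffer.block_tag == block_tag:
        if offset in miss_buffer.words:
            return True, miss_buffer.words[offset]

    return False, None  # true miss: a main memory access is required


# Example: a fill of block 7 is in progress and word 2 has already returned.
mb = CacheMissBuffer()
mb.start_fill(7)
mb.word_arrived(2, 0xAB)
print(lookup({}, {}, mb, 7 * BLOCK_WORDS + 2))  # (True, 171): served mid-fill
print(lookup({}, {}, mb, 7 * BLOCK_WORDS + 3))  # (False, None): word not yet returned

In this sketch only the miss-buffer path is modeled; in the patented design the same control logic also steers the cache array, cache multiplexor, main memory, and the instruction/data destination based on the tag-match results and the state of the fill.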