The patent badge is an abbreviated version of the USPTO patent document. It covers the following: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventors, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The badge also contains a link to the full patent document in Adobe Acrobat (PDF) format, which can be used to download or print the patent.

Date of Patent: Mar. 07, 2000
Filed: Jun. 02, 1999
Applicant:
Inventors: Chia-Chang Hsu, Yung-kang, TW; Ruey-Liang Ma, Lo-Tung, TW; Chien-kuo Tien, Taipei, TW; Kun-Cheng Wu, Hsinchu, TW
Attorney:
Primary Examiner:
Int. Cl.: G06F /
U.S. Cl.: 712/210; 712/204
Abstract

A processor architecture is disclosed including a fetcher, packet unit and branch target buffer. The branch target buffer is provided with a tag RAM that is organized in a set associative fashion. In response to receiving a search address, multiple sets in the tag RAM are simultaneously searched for a branch instruction that is predicted to be taken. The packet unit has a queue into which fetched cache blocks are stored containing instructions. Sequentially fetched cache blocks are stored in adjacent locations of the queue. The queue entries also have indicators that indicate whether or not a starting or final data word of an instruction sequence is contained in the queue entry and if so, an offset indicating the particular starting or final data word. In response, the packet unit concatenates data words of an instruction sequence into contiguous blocks. The fetcher generates a fetch address for fetching a cache block from the instruction cache containing instructions to be executed. The fetcher also generates a search address for output to the branch target buffer. In response to the branch target buffer detecting a taken branch that crosses multiple cache blocks, the fetch address is increased so that it points to the next cache block to be fetched but the search address is maintained the same.
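The abstract describes a concrete mechanism: a set-associative branch target buffer whose ways are searched simultaneously, a packet-unit queue whose entries carry start/end indicators with offsets, and a fetcher that advances the fetch address to the next cache block while holding the search address fixed. The C sketch below is only a rough model of those structures as the abstract characterizes them; the type names, field widths, cache block size, and BTB geometry are assumptions chosen for illustration and are not taken from the patent.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical parameters -- the abstract does not fix these values. */
#define CACHE_BLOCK_WORDS 8   /* data words per fetched cache block  */
#define BTB_SETS          64  /* sets in the set-associative tag RAM */
#define BTB_WAYS          4   /* ways searched simultaneously        */

/* One packet-unit queue entry: a fetched cache block plus indicators
 * telling whether the starting or final data word of an instruction
 * sequence lies in this block and, if so, at which offset.           */
typedef struct {
    uint32_t words[CACHE_BLOCK_WORDS];
    bool     has_start;     /* block holds the first word of a sequence */
    uint8_t  start_offset;  /* offset of that first word, if has_start  */
    bool     has_end;       /* block holds the final word of a sequence */
    uint8_t  end_offset;    /* offset of that final word, if has_end    */
} queue_entry_t;

/* One way of the BTB tag RAM: tag plus predicted-taken branch target. */
typedef struct {
    bool     valid;
    uint32_t tag;
    uint32_t target;
    bool     predict_taken;
} btb_way_t;

typedef struct {
    btb_way_t ways[BTB_SETS][BTB_WAYS];
} btb_t;

/* Search every way of the indexed set (in hardware, in parallel;
 * modelled here as a loop) for a branch predicted to be taken.     */
static bool btb_lookup(const btb_t *btb, uint32_t search_addr, uint32_t *target)
{
    uint32_t block = search_addr / (CACHE_BLOCK_WORDS * 4);
    uint32_t set   = block % BTB_SETS;
    uint32_t tag   = block / BTB_SETS;
    for (int w = 0; w < BTB_WAYS; w++) {
        const btb_way_t *way = &btb->ways[set][w];
        if (way->valid && way->tag == tag && way->predict_taken) {
            *target = way->target;
            return true;
        }
    }
    return false;
}

/* Fetcher state: the address used to fetch cache blocks and the
 * address used to search the BTB are kept separately.             */
typedef struct {
    uint32_t fetch_addr;
    uint32_t search_addr;
} fetcher_t;

/* When a taken branch crosses multiple cache blocks, advance the fetch
 * address to the next block while leaving the search address unchanged,
 * as the abstract describes.                                            */
static void fetcher_step_multiblock_branch(fetcher_t *f)
{
    f->fetch_addr += CACHE_BLOCK_WORDS * 4;  /* point at the next cache block */
    /* f->search_addr is intentionally left the same */
}
```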

