The patent badge is an abbreviated version of the full USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventors, assignee, attorney firm, primary examiner, assistant examiner, CPC classifications, and abstract. The badge also contains a link to the full patent document in Adobe Acrobat (PDF) format.

Date of Patent: Jul. 28, 1998
Filed: Jun. 03, 1996
Applicant:
Inventors: Ronald M Kaplan, Palo Alto, CA (US); Atty T Mullins, Tucson, AZ (US)
Assignee: Xerox Corporation, Stamford, CT (US)
Attorney:
Primary Examiner:
Assistant Examiner:
Int. Cl.: G06F / ; G06F /
U.S. Cl.: 704/8; 704/2; 704/9; 704/10; 707/101; 707/102; 707/5; 707/532; 707/536
Abstract

A computerized multilingual translation dictionary includes a set of words and phrases for each of the languages it contains, plus a mapping that indicates, for each word or phrase in one language, what the corresponding translations in the other languages are. The set of words and phrases for each language is divided among corresponding concept groups based on an abstract pivot language. The words and phrases are encoded as token numbers assigned by a word-number mapper and laid out in a sequence that can be searched fairly rapidly with a simple linear scan. The complex associations of words and phrases to particular pivot-language senses are represented by including a list of pivot-language sense numbers with each word or phrase. The preferred coding of these sense numbers is a bit vector for each word, where each bit corresponds to a particular pivot element in the abstract language, and the bit is ON if the given word is a translation of that pivot element. Then, to determine whether a word in language 1 translates to a word in language 2 requires only a bit-wise intersection of their associated bit vectors. Each word or phrase is prefixed by its bit-vector token number, so the bit-vector tokens do double duty: they also act as separators between the tokens of one phrase and those of another. A pseudo-Huffman compression scheme is used to reduce the size of the token stream. Because of the frequency skew for the bit-vector tokens, this produces a very compact encoding.
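The bit-vector sense test described in the abstract can be sketched in a few lines: give each word a bit vector over pivot-language senses, and two words translate each other exactly when their vectors have a nonzero bitwise intersection. The words, senses, and bit patterns below are illustrative assumptions, not data from the patent itself.

```python
# Minimal sketch of the bit-vector sense test from the abstract.
# Bit i of a word's vector is ON when that word is a translation of
# pivot element i. All entries here are hypothetical examples.

senses = {
    ("en", "bank"):   0b01100,  # riverbank sense + financial-institution sense
    ("fr", "banque"): 0b00100,  # financial-institution sense only
    ("fr", "rive"):   0b01000,  # riverbank sense only
    ("fr", "pomme"):  0b00001,  # an unrelated sense
}

def translates(w1, w2):
    """Words translate each other iff their sense bit vectors intersect."""
    return (senses[w1] & senses[w2]) != 0

print(translates(("en", "bank"), ("fr", "banque")))  # True  (shared sense)
print(translates(("en", "bank"), ("fr", "rive")))    # True  (shared sense)
print(translates(("en", "bank"), ("fr", "pomme")))   # False (no shared sense)
```

The appeal of this representation is that the translation test is a single machine-word AND per word pair, regardless of how many senses the pivot language defines (up to the vector width).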

