The patent badge is an abbreviated version of the USPTO patent document. It covers the following: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The patent badge also contains a link to the full patent document in Adobe Acrobat (PDF) format.
Patent No.:
Date of Patent: Jul. 12, 1994
Filed: Nov. 25, 1992
Inventors:
David B. Kirk, South Pasadena, CA (US);
Douglas A. Kerns, Pasadena, CA (US);
Brooke P. Anderson, Pasadena, CA (US);
Kurt Fleischer, Pasadena, CA (US);
Alan H. Barr, Pasadena, CA (US)
Abstract
A circuit and method for estimating gradients of a target function using noise injection and correlation is provided. In one embodiment, an input signal is combined with an input noise signal and the combined signal is input to a circuit which computes the output of the target function. An amplified noise signal and the output signal of the target function are input to a multiplier which performs a correlation of the inputs. The output of the multiplier is processed by a low-pass filter which generates the gradient. The circuit and method can be expanded to N dimensions. Furthermore, in an alternate embodiment, differentiators are coupled between the amplifier and the multiplier and between the output of the target function and the multiplier, so that both signals are differentiated prior to input to the multiplier. In other embodiments, the circuit may be used to compute gradient-like signals, wherein each component of the gradient is individually scaled by a different value. The output of the circuit can then be used in other descent algorithms. In addition, by varying the scale of the noise signal over a time schedule, an annealing style of optimization can be implemented. This prevents the gradient descent process from stopping at local minima while descending to the global minimum of the function.
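To make the scheme concrete, below is a minimal discrete-time sketch in Python of the noise-injection-and-correlation estimator the abstract describes. It simulates the signal path (inject noise, evaluate the target function, correlate the output with the noise, smooth with a low-pass filter); it is not the patent's analog circuit. The function name, step count, and filter constant are illustrative assumptions, and subtracting the unperturbed output f(x) is a numerical stand-in for the DC removal that the differentiator embodiment performs.

```python
import numpy as np

def estimate_gradient(f, x, noise_scale=0.05, n_steps=5000, alpha=0.005):
    """Sketch: estimate grad f(x) by noise injection and correlation.

    Each step injects a small noise vector n into the input, feeds
    x + n to the target function, correlates the output with the
    noise (the multiplier stage), and smooths the product with a
    first-order low-pass filter (exponential moving average).
    """
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)     # low-pass filter state
    y0 = f(x)                   # unperturbed output; subtracting it removes
                                # the DC component, loosely analogous to the
                                # patent's differentiator (AC-coupled) variant
    for _ in range(n_steps):
        n = noise_scale * np.random.randn(*x.shape)   # injected noise signal
        y = f(x + n)                                  # target function output
        corr = (y - y0) * n / noise_scale**2          # correlate output with noise
        grad += alpha * (corr - grad)                 # first-order low-pass filter
    return grad

# Example: f(x, y) = x^2 + 3y^2 has gradient (2x, 6y), so (2, -12) at (1, -2).
f = lambda v: v[0]**2 + 3.0 * v[1]**2
print(estimate_gradient(f, [1.0, -2.0]))   # roughly [2, -12]
```

Shrinking noise_scale over a schedule of calls would give the annealing-style behavior the abstract mentions: early, large-noise steps can carry the descent past local minima before the estimate sharpens near the global minimum.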