Weston, MA, United States of America

Steve Mark Lorusso


Average Co-Inventor Count = 10.0

ph-index = 1

Forward Citations = 3 (Granted Patents)


Company Filing History:


Years Active: 2024

where 'Filed Patents' are based on already granted patents

1 patent (USPTO):

Title: Innovations by Steve Mark Lorusso

Introduction

Steve Mark Lorusso is an accomplished inventor based in Weston, MA. He has made significant contributions to the field of acoustic event detection. His approach pairs automated acoustic detection with user interaction to recognize custom, user-defined acoustic events.

Latest Patents

Lorusso holds a patent for an acoustic event detection system. The system detects custom acoustic events by generating an acoustic event profile from a natural language description provided by the user. For instance, if a user describes a custom acoustic event as a "dog bark," the system can refine the description by asking follow-up questions about the dog's breed, gender, and age. Using an audio sample corresponding to the refined description, the system can then identify potential occurrences of the described event in the user's environment.
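As an illustration only, and not a description of the patented method, the sketch below shows one way such a flow could be organized in Python: a user's description is refined with attribute details into an acoustic event profile, and candidate audio clips are scored against that profile with an embedding-based similarity check. The AcousticEventProfile class, the embed_text/embed_audio placeholder encoders, and the similarity threshold are all hypothetical assumptions introduced for this example.

```python
# Hypothetical sketch of profile-based acoustic event detection.
# embed_text() and embed_audio() are stand-ins for real text/audio
# encoders; they are NOT taken from the patent.
from dataclasses import dataclass, field

import numpy as np


@dataclass
class AcousticEventProfile:
    description: str                                  # e.g. "dog bark"
    attributes: dict = field(default_factory=dict)    # e.g. {"breed": "beagle"}

    def refined_description(self) -> str:
        """Combine the base description with the refinement attributes."""
        details = ", ".join(f"{k}: {v}" for k, v in self.attributes.items())
        return f"{self.description} ({details})" if details else self.description


def embed_text(text: str) -> np.ndarray:
    """Placeholder text encoder; a real system would use a trained model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)


def embed_audio(clip: np.ndarray) -> np.ndarray:
    """Placeholder audio encoder operating on a raw waveform array."""
    rng = np.random.default_rng(int(clip.sum() * 1e6) % (2**32))
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)


def detect_event(profile: AcousticEventProfile,
                 clips: list[np.ndarray],
                 threshold: float = 0.8) -> list[int]:
    """Return indices of clips whose embedding is similar to the profile's."""
    target = embed_text(profile.refined_description())
    return [i for i, clip in enumerate(clips)
            if float(np.dot(embed_audio(clip), target)) >= threshold]


if __name__ == "__main__":
    profile = AcousticEventProfile("dog bark",
                                   {"breed": "beagle", "age": "puppy"})
    clips = [np.random.default_rng(i).normal(size=16000) for i in range(3)]
    # A permissive threshold is used only to exercise the placeholder
    # encoders; real embeddings would make 0.8 a meaningful cutoff.
    print("Matching clips:", detect_event(profile, clips, threshold=-1.0))
```

In a production setting the two encoders would map descriptions and audio into a shared embedding space, so the same scoring loop could run over a live microphone stream rather than a fixed list of clips.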

Career Highlights

Lorusso is currently employed at Amazon Technologies, Inc., where he continues to develop innovative solutions in technology. His work focuses on enhancing user experience through advanced acoustic detection systems. His contributions have been instrumental in pushing the boundaries of what is possible in this field.

Collaborations

Some of Lorusso's notable coworkers include Qin Zhang and Qingming Tang. Their collaboration has fostered an environment of innovation and creativity, leading to the development of cutting-edge technologies.

Conclusion

Steve Mark Lorusso's work in acoustic event detection exemplifies the intersection of technology and user engagement. His innovative solutions have the potential to transform how we interact with sound in our environments.
