The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventors, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The patent badge also contains a link to the full patent document in Adobe Acrobat (PDF) format.

Date of Patent: Feb. 18, 2020
Filed: Feb. 20, 2018
Applicant: Verizon Digital Media Services Inc., Playa Vista, CA (US)
Inventors: Jonathan DiVincenzo, Los Angeles, CA (US); Seungyeob Choi, Northridge, CA (US); Karthik Sathyanarayana, Los Angeles, CA (US); Robert J. Peters, Santa Monica, CA (US); Eric Dyoniziak, Streamwood, IL (US)
Assignee: Verizon Digital Media Services Inc., Playa Vista, CA (US)
Attorneys:
Primary Examiner:
Int. Cl.: H04L 29/08 (2006.01); H04N 21/845 (2011.01); H04N 21/2187 (2011.01); H04N 21/231 (2011.01)
U.S. Cl. CPC: H04L 67/1014 (2013.01); H04L 67/1008 (2013.01); H04L 67/2847 (2013.01); H04N 21/2187 (2013.01); H04N 21/23106 (2013.01); H04N 21/8456 (2013.01); H04L 67/02 (2013.01)
Abstract

Some embodiments provide intelligent predictive stream caching for live, linear, or video-on-demand streaming content using prefetching, segmented caching, and request clustering. Prefetching involves retrieving streaming content segments from an origin server prior to the segments being requested by users. Prefetching live or linear streaming content segments involves continually reissuing requests to the origin until the segments are obtained or a preset retry duration is completed. Prefetching is initiated in response to a first request for a segment falling within a particular interval. Request clustering commences thereafter. Subsequent requests are queued until the segments are retrieved. Segmented caching involves caching segments for one particular interval. Segments falling within a next interval are not prefetched until a first request for one such segment in the next interval is received. Cached segments from the previous interval can be replaced in cache with segments for the current interval, thereby minimizing cache footprint utilization.
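The interplay of the three mechanisms described in the abstract can be sketched in Python. This is an illustrative simplification, not the patented implementation: the class name, the `origin_fetch` callable, and the fixed `segments_per_interval` parameter are all hypothetical, and the origin retry loop for live content is omitted. The first request for a segment in an interval triggers prefetching of that whole interval; concurrent requests cluster behind the in-flight fetch via events; caching a new interval replaces the previous interval's segments.

```python
import threading

class PredictiveStreamCache:
    """Sketch of prefetching, request clustering, and segmented caching.

    origin_fetch is a hypothetical callable (segment_id -> content) standing
    in for a request to the origin server.
    """

    def __init__(self, origin_fetch, segments_per_interval=3):
        self.origin_fetch = origin_fetch
        self.segments_per_interval = segments_per_interval
        self.cache = {}                  # segment_id -> content
        self.prefetched_intervals = set()
        self.events = {}                 # segment_id -> Event (request clustering)
        self.lock = threading.Lock()

    def _interval_of(self, segment_id):
        return segment_id // self.segments_per_interval

    def get(self, segment_id):
        with self.lock:
            if segment_id in self.cache:
                return self.cache[segment_id]
            interval = self._interval_of(segment_id)
            if interval not in self.prefetched_intervals:
                # First request in this interval: start prefetching all of it.
                self.prefetched_intervals.add(interval)
                # Segmented caching: evict segments from previous intervals.
                self.cache = {s: c for s, c in self.cache.items()
                              if self._interval_of(s) == interval}
                start = interval * self.segments_per_interval
                for s in range(start, start + self.segments_per_interval):
                    self.events[s] = threading.Event()
                threading.Thread(target=self._prefetch, args=(interval,)).start()
            event = self.events[segment_id]
        # Request clustering: queue behind the in-flight prefetch.
        event.wait()
        return self.cache[segment_id]

    def _prefetch(self, interval):
        start = interval * self.segments_per_interval
        for s in range(start, start + self.segments_per_interval):
            content = self.origin_fetch(s)  # one origin hit per segment
            with self.lock:
                self.cache[s] = content
            self.events[s].set()
```

Under these assumptions, requesting segment 0 causes segments 0 through 2 to be fetched from the origin exactly once, and a later request for segment 3 starts a new interval whose segments displace the old ones in cache.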

