Multi-Layer Multi-Hit Caching for Long Tail Content

A content caching and multi-layer caching technology, applied in the field of content caching, which can solve problems such as reduced caching server uptime, increased risk of storage medium failure, and decreased caching server performance, so as to minimize the effect of long-tail content on cache performance, optimize multi-hit caching, and retain much of the efficiency of first hit caching.

Active Publication Date: 2013-08-29
Owner: EDGIO INC

AI Technical Summary

Benefits of technology

This patented technology describes optimized multi-hit caching, in which content is cached only after it has been requested multiple times rather than on the first request. Because rarely requested long-tail content is not written to the caching server's storage, fewer writes and purges are performed than with first hit caching, which prolongs the life of the storage medium and lets popular cached content be served faster over time. The caching servers can also work together more efficiently, since different layers of the cache handle content independently based on its importance or popularity. Overall, the method improves delivery performance and reduces the cost of storing large amounts of content locally.
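As a rough, illustrative sketch of the multi-hit admission idea (the two-hit threshold, the in-memory counter, and the names below are assumptions for illustration, not the patent's specific implementation), a caching server can decline to store a response until the same URL has been requested more than once:

```python
# Minimal sketch of an N-hit cache admission policy. The threshold, the
# in-memory counter, and the names here are illustrative assumptions,
# not the patent's implementation.
from collections import defaultdict

CACHE_HIT_THRESHOLD = 2            # admit to cache on the second request
request_counts = defaultdict(int)  # requests seen per URL
cache = {}                         # URL -> cached response body

def fetch(url, origin_fetch):
    """Serve from cache when possible; otherwise fetch from the origin and
    cache the response only once the URL has been requested enough times."""
    if url in cache:
        return cache[url]
    request_counts[url] += 1
    body = origin_fetch(url)
    if request_counts[url] >= CACHE_HIT_THRESHOLD:
        cache[url] = body          # URLs requested only once never reach storage
    return body
```

Under a first hit policy every miss would be written to storage; the threshold keeps content that is requested only once out of the cache entirely, which is where the disk-write and storage-lifetime savings come from.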

Problems solved by technology

The technical problem addressed by the patent is that existing caching systems handle long-tail content poorly. Long-tail content is requested only rarely, yet with first hit caching it is still written to the caching server's storage on its first request, reducing the uptime of the caching server, increasing the risk of storage medium failure, and decreasing caching performance. Existing approaches that track request counts before caching require significant effort to deploy and consume considerable memory and processing resources, so they cannot achieve the desired level of service within acceptable limits.

Embodiment Construction

[0051]In the following detailed description, numerous details, examples, and embodiments for systems and methods for optimized multi-hit caching are set forth and described. As one skilled in the art would understand in light of the present description, these systems and methods are not limited to the embodiments set forth, and these systems and methods may be practiced without some of the specific details and examples discussed. Also, reference is made to the accompanying figures, which illustrate specific embodiments in which the systems and methods can be practiced. It is to be understood that other embodiments can be used and structural changes can be made without departing from the scope of the embodiments herein described.

[0052]To aid in the discussion below, an overview for a distributed environment in which multi-hit caching is to be performed is presented in FIG. 2. FIG. 2 presents an exemplary CDN infrastructure that includes a distributed set of caching servers 210, traffic

Abstract

Some embodiments provide an optimized multi-hit caching technique that minimizes the performance impact associated with caching of long-tail content while retaining much of the efficiency and minimal overhead associated with first hit caching in determining when to cache content. The optimized multi-hit caching utilizes a modified bloom filter implementation that performs flushing and state rolling to delete indices representing stale content from a bit array used to track hit counts, without affecting identification of other content that may be represented with indices overlapping those representing the stale content. Specifically, a copy of the bit array is stored prior to flushing the bit array so as to avoid losing track of previously requested and cached content when the bit array is flushed, and the flushing is performed to remove the bit indices representing stale content from the bit array and to minimize the possibility of a false positive.
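As a rough sketch of the flushing and state rolling described above (the array size, hash count, flush interval, and double-hashing scheme are illustrative assumptions rather than values taken from the patent), the filter below keeps a copy of the previous bit array whenever the current array is flushed, so content requested shortly before a flush is still recognized as a repeat request:

```python
import hashlib
import time

class RollingBloomFilter:
    """Two-array bloom filter sketch for multi-hit caching: 'flushing' clears
    the current bit array on an interval, and 'state rolling' keeps a copy of
    the pre-flush array so recent first hits are not forgotten."""

    def __init__(self, num_bits=2 ** 20, num_hashes=4, flush_interval=3600.0):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.flush_interval = flush_interval      # seconds between flushes
        self.current = bytearray(num_bits // 8)   # bits set since the last flush
        self.previous = bytearray(num_bits // 8)  # copy of the pre-flush array
        self.last_flush = time.monotonic()

    def _indices(self, key):
        # Derive num_hashes bit positions from one digest (double hashing).
        digest = hashlib.sha256(key.encode("utf-8")).digest()
        h1 = int.from_bytes(digest[:8], "big")
        h2 = int.from_bytes(digest[8:16], "big")
        return [(h1 + i * h2) % self.num_bits for i in range(self.num_hashes)]

    @staticmethod
    def _test(bits, idx):
        return bits[idx // 8] & (1 << (idx % 8)) != 0

    @staticmethod
    def _set(bits, idx):
        bits[idx // 8] |= 1 << (idx % 8)

    def _maybe_flush(self):
        # State rolling: retain the old array, then start a fresh one, so
        # stale indices age out without losing track of recent requests.
        if time.monotonic() - self.last_flush >= self.flush_interval:
            self.previous = self.current
            self.current = bytearray(self.num_bits // 8)
            self.last_flush = time.monotonic()

    def seen_before(self, key):
        """Record a request and report whether this key was already requested
        recently, i.e. whether this is at least its second hit."""
        self._maybe_flush()
        idx = self._indices(key)
        hit = (all(self._test(self.current, i) for i in idx)
               or all(self._test(self.previous, i) for i in idx))
        for i in idx:
            self._set(self.current, i)
        return hit
```

A caching server would call seen_before(url) on each request and admit the response to its cache only when it returns True; periodic flushing ages out indices for stale long-tail content and keeps the false positive rate low, while the retained copy prevents recently requested content from being treated as a first hit again.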

