    • 21. Invention Application
    • Title: NETWORK CACHING FOR MULTIPLE CONTEMPORANEOUS REQUESTS
    • Publication No.: US20100268789A1
    • Publication Date: 2010-10-21
    • Application No.: US12425395
    • Filing Date: 2009-04-17
    • Inventors: Won Suk Yoo; Anil K. Ruia; Himanshu Patel; John A. Bocharov; Ning Lin
    • IPC: G06F15/167
    • CPC: H04L67/2842; H04L67/2833; H04L67/2885
    • Abstract: A live caching system is described herein that reduces the burden on origin servers for serving live content. In response to receiving a first request that results in a cache miss, the system forwards the first request to the next tier while “holding” other requests for the same content. If the system receives a second request while the first request is pending, the system will recognize that a similar request is outstanding and hold the second request by not forwarding the request to the origin server. After the response to the first request arrives from the next tier, the system shares the response with other held requests. Thus, the live caching system allows a content provider to prepare for very large events by adding more cache hardware and building out a cache server network rather than by increasing the capacity of the origin server.
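
The abstract above describes a hold-and-share pattern (often called request coalescing): the first cache miss is forwarded to the next tier, concurrent requests for the same content are held rather than forwarded, and the single response is then shared with all held requests. The following Python sketch is illustrative only and not taken from the patent; the class name LiveCache and the fetch_from_origin callable are hypothetical, and a threaded cache server is assumed.

    import threading

    class LiveCache:
        """Minimal sketch of the hold-and-share pattern described in the abstract.

        The first cache miss for a key forwards the request to the next tier;
        concurrent requests for the same key wait on that in-flight fetch
        instead of reaching the origin again. Error handling for a failed
        origin fetch is omitted for brevity.
        """

        def __init__(self, fetch_from_origin):
            self._fetch = fetch_from_origin      # callable: key -> response
            self._cache = {}                     # key -> cached response
            self._pending = {}                   # key -> Event for in-flight fetches
            self._lock = threading.Lock()

        def get(self, key):
            with self._lock:
                if key in self._cache:           # cache hit
                    return self._cache[key]
                event = self._pending.get(key)
                if event is None:                # first miss: this caller fetches
                    event = threading.Event()
                    self._pending[key] = event
                    leader = True
                else:                            # a fetch is already outstanding: hold
                    leader = False

            if leader:
                response = self._fetch(key)      # forward to the next tier / origin
                with self._lock:
                    self._cache[key] = response
                    del self._pending[key]
                event.set()                      # release all held requests
                return response

            event.wait()                         # held request: wait for the shared response
            with self._lock:
                return self._cache[key]

Because held requests wait on a shared event rather than issuing their own fetch, the origin sees at most one request per key regardless of how many clients ask for the same live content at once.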