As media companies focus on growing their direct-to-consumer (D2C) businesses, the ecosystem that delivers content over IP-based networks continues to evolve and becomes ever more critical. To gain efficiencies, media technology providers want to make better use of each component in the content delivery path. Caching popular content at the network edge, closer to end users, is a common ISP network management practice that has long been part of the Internet's architecture. The new Open Caching standard enables a move away from proprietary systems while still supporting an environment in which content delivery networks, last-mile providers, and streaming technologies together bring content closer to the consumer. Based on specifications developed by the Streaming Video Alliance's (SVA) Open Caching Working Group, networks are deploying caching servers directly at the edge of the network rather than in cloud data centers located farther from where customers engage with the content. Because cached content travels a shorter distance over the network, through fewer routers and switches, it reaches the end device sooner. As a result, consumers see a faster time to the first frame and encounter fewer network events that cause freezing or buffering. This presentation will review the latest Open Caching specifications, illustrate the interoperability of the SVA's APIs, and highlight a recent field test in which customers throughout the FiOS footprint accessed select content via Open Caching and experienced faster start times, smoother streaming, and less buffering.
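To make the edge-caching idea concrete, the sketch below models the kind of request routing Open Caching enables: a client is directed to an ISP-operated caching node when one serves the client's network footprint, and falls back to the CDN edge otherwise. This is a minimal illustration only; the hostnames, IP prefixes, and the `route_request` helper are hypothetical assumptions for this sketch, not taken from the SVA specifications.

```python
# Illustrative sketch of footprint-based request routing: redirect a client
# to an in-network open caching node (OCN) when its IP falls inside a
# delegated footprint, otherwise fall back to the CDN edge.
# All names and prefixes below are hypothetical examples.
import ipaddress

# Footprints an ISP has delegated to its caching nodes (illustrative values).
OCN_FOOTPRINTS = {
    "ocn1.isp.example": ipaddress.ip_network("203.0.113.0/24"),
    "ocn2.isp.example": ipaddress.ip_network("198.51.100.0/24"),
}
CDN_EDGE = "edge.cdn.example"  # fallback when no in-network cache matches

def route_request(client_ip: str, path: str) -> str:
    """Return the URL a client should fetch `path` from (a 302-style target)."""
    addr = ipaddress.ip_address(client_ip)
    for host, footprint in OCN_FOOTPRINTS.items():
        if addr in footprint:
            # Client is inside this ISP footprint: serve from the nearby cache.
            return f"https://{host}{path}"
    # No footprint matched: serve from the CDN's own edge.
    return f"https://{CDN_EDGE}{path}"

print(route_request("203.0.113.42", "/movies/trailer.m3u8"))
print(route_request("192.0.2.7", "/movies/trailer.m3u8"))
```

Routing by footprint in this way is what shortens the network path: a matched client fetches from a cache inside its own ISP rather than traversing additional routers to a distant data center.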