Real-time Audio-Visual Media Transport over QUIC

I gave a presentation at the ACM CoNEXT Workshop on Evolution, Performance, and Interoperability of QUIC yesterday, discussing how best to transport real-time traffic over the emerging QUIC transport protocol.

Presentation on Real-time Audio-Visual Media Transport over QUIC

The paper I presented, co-authored with Jörg Ott, reviews the development of Internet real-time media transport protocols, looking at HTTP adaptive streaming (e.g., MPEG DASH) and RTP-based media. It considers the strengths and weaknesses of those protocols, focusing especially on features that affect their performance and ease of deployment. We review some previous proposals for adding datagrams and real-time support to QUIC, then present the outline of a new design that incorporates into QUIC the essential features needed to support real-time media.

This proposal adds an RT_STREAM abstraction to QUIC, allowing it to deliver frames of real-time data. We incorporate the concept of application-level framing for robustness against packet loss, and carry timing and sequencing information in the media frames. The minimal extensions we propose allow the receiver to reconstruct media timing and to detect and recover from packet loss. They also provide the basis for partial reliability and adaptive, media-aware congestion control. The paper outlines the key design issues, and we expect to submit a more detailed design document to the IETF for discussion in the next few weeks.
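The wire format is left to the forthcoming design document, but the underlying idea can be illustrated with a sketch. Assuming a hypothetical per-frame header that carries a sequence number and a media timestamp (the field names and sizes here are illustrative, not the actual proposal), a receiver can reconstruct timing and spot missing frames:

```python
import struct

# Hypothetical RT_STREAM frame header (illustrative only, not the
# proposal's wire format): 32-bit sequence number, 32-bit media
# timestamp, 16-bit payload length, all in network byte order.
HEADER = struct.Struct("!IIH")

def pack_frame(seq: int, timestamp: int, payload: bytes) -> bytes:
    """Prepend the sketch header to a media frame payload."""
    return HEADER.pack(seq, timestamp, len(payload)) + payload

def unpack_frame(data: bytes):
    """Split a received frame back into (seq, timestamp, payload)."""
    seq, timestamp, length = HEADER.unpack_from(data)
    return seq, timestamp, data[HEADER.size:HEADER.size + length]

def detect_losses(received_seqs):
    """Given sorted sequence numbers of received frames, return the
    sequence numbers of frames that are missing between them."""
    missing = []
    for a, b in zip(received_seqs, received_seqs[1:]):
        missing.extend(range(a + 1, b))
    return missing
```

Because each frame is self-describing in this way (the application-level framing principle), a frame that arrives after a loss is still immediately usable, and the gap in sequence numbers tells the receiver what to conceal or request again.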

Opinions expressed are my own, and do not represent those of my employers or the organisations that fund my research.