Enhancing Video Streaming Using Real-Time Gaze Tracking


Sebastian Arndt; Jan-Niklas Antons



While watching videos, users can focus on and evaluate only a small part of the presented frame: the region within their foveal field of view. Video coding, however, often assumes an equal distribution of attention or uses predefined regions of interest, regardless of where the observer is actually looking. In this paper, we propose a system that uses real-time eye-gaze information to adapt video coding to the viewer's attention focus. In a subjective quality evaluation conducted after participants used a prototype, we show that they rate attention-focus-based coding higher than traditional coding. In the long run, this result may stimulate further research in the domain of real-time gaze-based video coding.
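The core idea, spending bits where the viewer is actually looking, can be illustrated with a gaze-contingent quantization map. The following Python sketch is purely hypothetical and is not the coding scheme evaluated in the paper: the function name, the 16-pixel macroblock size, and the linear quality falloff with distance from the gaze point are all illustrative assumptions.

```python
import math

def qp_offsets(gaze_x, gaze_y, frame_w, frame_h, block=16,
               max_offset=10, radius=200.0):
    """Hypothetical per-macroblock QP offset map for gaze-contingent coding.

    Blocks near the current gaze point get offset 0 (full quality);
    the offset grows linearly with distance, up to max_offset in the
    periphery, so an encoder could quantize peripheral blocks more
    coarsely without the viewer noticing.
    """
    cols = math.ceil(frame_w / block)
    rows = math.ceil(frame_h / block)
    offsets = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Distance from the block centre to the gaze point.
            cx = c * block + block / 2
            cy = r * block + block / 2
            d = math.hypot(cx - gaze_x, cy - gaze_y)
            # Linear falloff: full quality inside the foveal radius,
            # progressively coarser quantization outside it.
            row.append(min(max_offset, int(max_offset * d / radius)))
        offsets.append(row)
    return offsets
```

In a real system, such a map would be recomputed every time the eye tracker reports a new fixation and fed to the encoder's rate-control stage; the paper's prototype realizes this adaptation loop in real time.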

The paper can be downloaded from the ISCA archive.