Int J Performability Eng, 2018, Vol. 14, Issue (5): 965-974. doi: 10.23940/ijpe.18.05.p15.965974

• Original Articles •

Quality Assessment of Sport Videos

Zhenqing Liu   

  1. Department of Physical Education, Beijing University of Technology, Beijing, 100124, China

Abstract:

In sports videos, adjacent frames tend to be highly similar. Exploiting this property, this paper extracts the video frames that matter most to the user's perceived quality as a test sequence and proposes a full-reference assessment method based on temporal and spatial features. Sports videos contain rich detail and rapidly changing pictures; accordingly, the method uses Spatial perceptual Information (SI) and Temporal perceptual Information (TI) to characterize every frame of ESPN sports videos. Based on this analysis, frames with both high temporal and high spatial perceptual information are extracted as the test sequence. Each frame in the sequence is then compared with its corresponding original frame to compute the peak signal-to-noise ratio (PSNR), and the average PSNR over the sequence serves as the video quality score. Rugby, basketball, and hockey videos were used as experimental subjects. By analyzing the PSNR of videos at different quality levels (good, fair, and poor), the paper determines PSNR ranges for each level that can be used in practice. The experimental results show that the proposed SI/TI-based analysis method can be deployed on ESPN sports video network platforms and similar services, automatically judging the quality of sports videos at different bit rates in real time, and that its scores have a high Spearman rank-order correlation coefficient (SROCC) with subjective quality assessment.
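The pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's code: the SI and TI measures follow the standard ITU-T P.910 style definitions (standard deviation of the Sobel-filtered luma, and of the inter-frame difference), and the top-quantile rule for selecting "high SI and high TI" frames is an assumption, since the abstract does not specify the selection threshold.

```python
import numpy as np

def sobel_magnitude(frame):
    """Sobel gradient magnitude of a grayscale frame (interior pixels only)."""
    f = frame.astype(np.float64)
    # Horizontal gradient: kernel [[-1,0,1],[-2,0,2],[-1,0,1]]
    gx = (f[:-2, 2:] + 2 * f[1:-1, 2:] + f[2:, 2:]
          - f[:-2, :-2] - 2 * f[1:-1, :-2] - f[2:, :-2])
    # Vertical gradient: kernel [[-1,-2,-1],[0,0,0],[1,2,1]]
    gy = (f[2:, :-2] + 2 * f[2:, 1:-1] + f[2:, 2:]
          - f[:-2, :-2] - 2 * f[:-2, 1:-1] - f[:-2, 2:])
    return np.sqrt(gx ** 2 + gy ** 2)

def si(frame):
    """Spatial perceptual Information: std of the Sobel-filtered frame."""
    return float(sobel_magnitude(frame).std())

def ti(frame, prev_frame):
    """Temporal perceptual Information: std of the inter-frame difference."""
    diff = frame.astype(np.float64) - prev_frame.astype(np.float64)
    return float(diff.std())

def psnr(ref, dist, peak=255.0):
    """Peak signal-to-noise ratio of a distorted frame against its reference."""
    mse = np.mean((ref.astype(np.float64) - dist.astype(np.float64)) ** 2)
    if mse == 0.0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

def assess(ref_frames, dist_frames, top_fraction=0.25):
    """Average PSNR over frames with both high SI and high TI.

    top_fraction is a hypothetical parameter: frames whose SI and TI both
    fall in the top quantile of the reference sequence form the test sequence.
    """
    sis = [si(f) for f in ref_frames]
    tis = [0.0] + [ti(ref_frames[i], ref_frames[i - 1])
                   for i in range(1, len(ref_frames))]
    si_thr = np.quantile(sis, 1.0 - top_fraction)
    ti_thr = np.quantile(tis, 1.0 - top_fraction)
    selected = [i for i in range(1, len(ref_frames))
                if sis[i] >= si_thr and tis[i] >= ti_thr]
    if not selected:            # fall back to all frames if none qualify
        selected = list(range(len(ref_frames)))
    return float(np.mean([psnr(ref_frames[i], dist_frames[i])
                          for i in selected]))
```

The average PSNR returned by `assess` would then be mapped to a quality level (good, fair, poor) using the empirically determined PSNR ranges, and validated against subjective scores via SROCC.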


Submitted on February 9, 2018; Revised on March 15, 2018; Accepted on April 15, 2018