Emulating displays with continuously varying frame rates

Krzysztof Templin, Piotr Didyk, Karol Myszkowski, and Hans-Peter Seidel









    The visual quality of a motion picture is significantly influenced by the choice of the presentation frame rate. Increasing the frame rate improves the clarity of the image and helps to alleviate many artifacts, such as blur, strobing, flicker, or judder. These benefits, however, come at the price of losing well-established film aesthetics, often referred to as the “cinematic look”. Current technology leaves artists with a sparse set of choices, e.g., 24 Hz or 48 Hz, limiting the freedom in adjusting the frame rate to artistic needs, content, and display technology. In this paper, we solve this problem by proposing a novel filtering technique which enables emulating the whole spectrum of presentation frame rates on a single-frame-rate display. The key component of our technique is a set of simple yet powerful filters calibrated and evaluated in psychophysical experiments. By varying their parameters we can achieve an impression of continuously varying presentation frame rate in both the spatial and temporal dimensions. This allows artists to achieve the best balance between the aesthetics and the objective quality of the motion picture. Furthermore, we show how our technique, informed by cinematic guidelines, can adapt to the content and achieve this balance automatically.
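The paper's actual filters are calibrated in psychophysical experiments and are not specified in this abstract. As a rough illustration of the underlying idea only, the sketch below (the function name and the `blend` parameter are hypothetical, not from the paper) emulates a lower apparent frame rate on a fixed-rate display by mixing each native frame with a sample-and-hold "virtual shutter" frame; sweeping the mixing weight varies the apparent frame rate continuously.

```python
def emulate_frame_rate(frames, display_hz, emulated_hz, blend=1.0):
    """Blend each full-rate frame with a 'held' frame so that motion on a
    display running at display_hz reads as if presented at emulated_hz.

    blend = 1.0 -> pure sample-and-hold (full low-frame-rate look),
    blend = 0.0 -> the untouched full-rate sequence; values in between
    give intermediate apparent frame rates. Frames are scalars here;
    for real images the same per-pixel mix would apply.
    """
    out = []
    for i, frame in enumerate(frames):
        # Index of the most recent "virtual shutter" frame at emulated_hz.
        # Integer arithmetic avoids floating-point boundary errors.
        held = (i * emulated_hz // display_hz) * display_hz // emulated_hz
        out.append(blend * frames[held] + (1.0 - blend) * frame)
    return out
```

Because `blend` can vary per frame, or per image region, this kind of filter can in principle change the apparent frame rate over time and across the picture, which is the continuous spatio-temporal control the abstract describes.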


