Live video broadcasting requires a wide range of professional expertise to realize multi-camera productions. Robotic systems can automate common, repetitive tracking shots, but predefined camera shots leave no room for quick adjustments when unpredictable events occur.
We introduce a modular automated robotic camera control and video switching system based on fundamental cinematographic rules. The actors’ positions are provided by a markerless tracking system, and the sound levels of the actors’ lavalier microphones are used to analyze the current scene. An expert system, implemented in Unity3D, determines appropriate camera angles and decides when to switch from one camera to another. A test production was conducted to observe the prototype in a live broadcast scenario and served as a video demonstration for an evaluation.
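The switching logic described above can be sketched as a simple rule-based "virtual director". The following is a minimal illustration only, not the published implementation (which was written in Unity3D); all names, the mic-level-based speaker heuristic, and the minimum-shot-duration rule are assumptions for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Actor:
    name: str
    mic_level: float  # normalized microphone level, 0..1 (assumed input)
    camera: str       # camera with the best angle on this actor (assumed precomputed)

# Hypothetical cinematographic rule: hold each shot for a minimum duration
# to avoid cutting too rapidly between cameras.
MIN_SHOT_SECONDS = 3.0

class Director:
    """Toy expert system: picks the camera covering the loudest actor."""

    def __init__(self):
        self.current_camera = None
        self.shot_elapsed = 0.0

    def update(self, actors, dt):
        """Advance time by dt seconds and return the camera to put on air."""
        self.shot_elapsed += dt
        speaker = max(actors, key=lambda a: a.mic_level)
        target = speaker.camera
        # Cut only if the desired camera changed and the current shot
        # has been held long enough.
        if target != self.current_camera and self.shot_elapsed >= MIN_SHOT_SECONDS:
            self.current_camera = target
            self.shot_elapsed = 0.0
        if self.current_camera is None:  # first frame: go straight to the speaker
            self.current_camera = target
        return self.current_camera
```

For example, if actor B starts speaking right after a cut to actor A, the director holds the shot on A until the minimum shot duration has elapsed, then cuts to B's camera.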
This project was published in the Proceedings of the ACM International Conference on Interactive Experiences for TV and Online Video 2016 (http://dl.acm.org/citation.cfm?id=2933559, sponsored by Samsung) as one of 44 contributions selected from 168 submissions (overall acceptance rate: 26%).