CineMPC in action. This figure depicts two experiments in which CineMPC manages both the intrinsic and extrinsic parameters of the camera at the user's direction. The first row shows the first experiment, where CineMPC controls both intrinsic and extrinsic parameters to capture a target (the chair), keeping it in focus and framed according to the rule of thirds. The second row shows two images from the second experiment, in which two targets (a car and a bottle) are filmed while meeting changing focus requirements (focus first on the bottle, then on the car) and composition requirements (placing the targets according to the rule of thirds). The drone remains motionless for the duration of this experiment, so all changes to the recording are carried out by manipulating the camera's intrinsic parameters: its focal length and focus distance. Credit: Pueyo et al.
Recent technological advances, such as increasingly sophisticated drones and cameras, have opened up exciting new possibilities for filmmaking. Most notably, directors can now shoot scenes from a wide range of previously unattainable angles and at much higher resolution.
Researchers from the University of Zaragoza and Stanford University recently developed CineMPC, a cinematography system in which a fully autonomous drone carrying a cinematic camera films multiple targets while following a director's instructions. The platform modulates the drone's and the camera's parameters to satisfy those instructions. The team's system, described in IEEE Transactions on Robotics, could bring a wave of innovation to the film industry and to other sectors that can benefit from high-quality video footage.
“Existing solutions for autonomous drone cinematography revealed a common oversight, namely that none provided automatic control of the intrinsic camera parameters (i.e., focal length, aperture, and focus distance),” Pablo Pueyo Ramon, co-author of the paper, told Tech Xplore.
“In cinematography, control of these parameters is essential to achieve different artistic and technical objectives, such as the desired depth of field (which parts of the scene are in focus or blurred) or iconic shots like the dolly-zoom effect. CineMPC fills this gap by autonomously determining the camera intrinsics needed to carry out a wide range of user-defined cinematic instructions.”
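The intrinsic parameters Pueyo mentions govern depth of field through well-known optics. As a rough illustration (this is not CineMPC's actual code; the function name and the 0.03 mm circle-of-confusion default are assumptions made for the example), the sketch below uses the standard hyperfocal approximation to show how focal length, aperture and focus distance determine which parts of a scene appear sharp:

```python
# Minimal sketch (not CineMPC's API): how the intrinsics mentioned in the
# article -- focal length, aperture (f-number) and focus distance -- map to
# depth of field via the standard hyperfocal approximation.

def depth_of_field(focal_length_mm: float, f_number: float,
                   focus_dist_mm: float, coc_mm: float = 0.03):
    """Return (near_limit_mm, far_limit_mm) of acceptably sharp depth."""
    # Hyperfocal distance: focusing here keeps everything from H/2 to
    # infinity acceptably sharp for the chosen circle of confusion.
    H = focal_length_mm ** 2 / (f_number * coc_mm) + focal_length_mm

    near = H * focus_dist_mm / (H + (focus_dist_mm - focal_length_mm))
    if focus_dist_mm >= H:  # focused beyond hyperfocal: far limit is infinite
        far = float("inf")
    else:
        far = H * focus_dist_mm / (H - (focus_dist_mm - focal_length_mm))
    return near, far

# Example: a 50 mm lens focused 3 m away. Wide open at f/1.8 only a narrow
# band around the subject is sharp; stopping down to f/8 widens that band.
# A controller can pick these values to keep one target sharp and another
# blurred, or to trade them off as the shot evolves.
print(depth_of_field(50, 1.8, 3000))  # shallow depth of field
print(depth_of_field(50, 8.0, 3000))  # wider depth of field
```

In a dolly-zoom, for instance, the focal length is increased while the camera moves toward the subject (or vice versa) so the subject's size stays constant while the background appears to stretch, which is why coordinated control of intrinsics and drone motion is needed.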
Pueyo and his colleagues have been developing innovative technologies for cinema for some time now. The new system they developed, called CineMPC, was first introduced in 2021.
CineMPC is essentially software that can be installed on any drone equipped with a controllable professional camera (e.g., a DSLR). As part of their recent study, the researchers expanded the functionality of the software and set out to evaluate its performance in a real-world setting.
“In addition to a better and more advanced control strategy, the new version of CineMPC includes a perception module capable of identifying relevant scene information, such as actors and actresses, making it a solution for 100% autonomous control,” Pueyo explained. “Finally, we are now releasing the source code so everyone can use it.”
Pueyo and his colleagues tested the improved version of CineMPC by deploying it on a real cinematography drone and filming various scenes with it. They found that the software achieved remarkable results, reliably estimating the relative poses of the filmed targets and allowing users to better control the images captured by the drone, for example by adding distinctive effects or tracking specific people or objects.
“In our opinion, the implications of our study for cinematography alone are remarkable,” Pueyo said. “We are extremely excited to offer filmmakers creative freedom, improved security and increased autonomy for real-time decision-making.”
The updated version of CineMPC is already easily accessible online to developers, filmmakers and cinematographers. Thus, it could soon be used to improve drone-based shooting of feature films, short films and documentaries.
CineMPC could also prove useful for a variety of other applications. For instance, by controlling the zoom of drone-mounted cameras, it could allow environmental scientists and researchers to closely monitor wildlife and natural environments remotely, without disturbing them. It is also a promising tool for precision agriculture, as it allows both remote monitoring of large fields and close inspection of specific plants.
“We are currently working on developing advanced AI techniques to simplify interaction with CineMPC, with the aim of democratizing the use of autonomous drones,” Pueyo added. “Instead of relying on handcrafted input instructions, we plan to develop user-friendly interfaces and intuitive controls.
“For example, we have just published another article in which the instructions to the drone are defined by a video sequence, for example a short clip from your favorite film. Our solution identifies the information needed to control the drone and camera to reproduce it, and CineMPC executes it.”
More information:
Pablo Pueyo et al, CineMPC: A Fully Autonomous Drone Cinematography System Incorporating Zoom, Focus, Pose, and Scene Composition, IEEE Transactions on Robotics (2024). DOI: 10.1109/TRO.2024.3353550
Pablo Pueyo et al, CineTransfer: Controlling a Robot to Imitate Cinematographic Style from a Single Example, 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (2023). DOI: 10.1109/IROS55552.2023.10342280
© 2024 Science X Network