Photographers looking to capture the perfect shot might soon be able to call on some unlikely helpers: a swarm of small robot helicopters.
Researchers from the Massachusetts Institute of Technology (MIT) and Cornell University are experimenting with a new autonomous drone, equipped with a light to create special effects during photo shoots. The drone, they say, could help photographers achieve difficult lighting effects more easily than with conventional lighting systems. The first version of this flying flashbulb will debut in August, at the 10th annual International Symposium on Computational Aesthetics in Graphics, Visualization and Imaging, in Vancouver, British Columbia.
At the conference, researchers said the drone will produce a particularly difficult effect known as "rim lighting," in which only the edge of a photographer's subject is strongly lit.
"[Rim lighting is] very sensitive to the position of the light," Manohar Srikanth, a senior researcher at Nokia who worked on the drone as a graduate and postdoctoral student at MIT, said in a statement. "If you move the light — say, by a foot — your appearance changes dramatically."
The newly developed system allows photographers to input the direction from which they want the rim light to come, as well as the width of the desired rim, or how much of the subject should be lit. The drone then flies itself to the proper side of the subject and maintains the specified rim width.
"If somebody is facing you, the rim you would see is on the edge of the shoulder, but if the subject turns sideways, so that he's looking 90 degrees away from you, then he's exposing his chest to the light, which means that you'll see a much thicker rim light," Srikanth said. "So, in order to compensate for the change in the body, the light has to change its position quite dramatically."
The handy drone can also adjust itself based on the photographer's movement. The robotic flier uses control signals from the photographer's camera to determine how to position itself.
These control signals are emitted from the camera about 20 times a second, with the camera producing an image that, rather than being stored in the camera's memory, is transmitted to a computer. The computer runs an algorithm, created by the researchers, that constantly evaluates the rim width and adjusts the drone's position accordingly.
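The researchers haven't published their control code, but the loop described here (grab a frame, estimate the rim width, nudge the drone) maps onto a textbook proportional controller. The Python sketch below is a toy illustration under that assumption; every name in it, including estimate_rim_width and the angle-based stand-in for a camera frame, is hypothetical and not part of the MIT and Cornell system.

```python
import time

# Illustrative sketch of the 20-updates-per-second feedback loop described
# above. None of these names come from the MIT/Cornell system; they are
# assumptions used to show one plausible control structure.

def estimate_rim_width(frame):
    """Placeholder for the researchers' rim-width estimator.

    A real implementation would segment the subject in the streamed image
    and measure the thickness of the lit band along its silhouette. In this
    toy model the 'frame' is just the drone's angle around the subject.
    """
    drone_angle = frame
    # Rim width grows from 0 (light directly behind) toward 1 as the
    # light swings around to the subject's side (toy relationship).
    return max(0.0, 1.0 - abs(90.0 - drone_angle) / 90.0)

def control_loop(target_rim_width, gain=30.0, rate_hz=20, steps=60):
    """Simple proportional controller: nudge the drone's angle around the
    subject until the measured rim width matches the requested width."""
    period = 1.0 / rate_hz
    drone_angle = 10.0  # degrees off the camera axis (arbitrary start)
    for _ in range(steps):
        frame = drone_angle                  # stand-in for a streamed image
        width = estimate_rim_width(frame)
        error = target_rim_width - width     # positive means rim too thin
        drone_angle += gain * error          # move farther around the subject
        time.sleep(period)                   # roughly 20 updates per second
    return drone_angle

final_angle = control_loop(target_rim_width=0.5)
print(f"settled at {final_angle:.1f} degrees off-axis")
```

In this toy model the controller settles at the angle where the simulated rim width matches the requested value; the real system closes the same kind of loop, but with an image-based estimator and the drone's actual flight dynamics in between.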
Frédo Durand, one of the project's researchers and a professor of computer science and engineering at MIT, said that this self-correction feature was the most difficult part of the drone project.
"The challenge was the manipulation of the very difficult dynamics of the [drone] and the feedback from the lighting estimation," Durand said. "That's where we put a lot of our efforts, to make sure that the control of the drone could work at the very high speed that's needed just to keep the thing flying and deal with the information from the [drone's laser range finder] and the rim-lighting estimation."
The prototype drone performed well in the motion-capture studio where it was tested, Srikanth said. However, making the drone robust enough to serve as a photographer's assistant in the real world may be trickier.
But overcoming such challenges should be possible, given the rapid advancements in robotics and related technologies, said Ravi Ramamoorthi, a professor of computer science and engineering at the University of California, San Diego.
"[Rim lighting is] very sensitive to the position of the light," Manohar Srikanth, a senior researcher at Nokia who worked on the drone as a graduate and postdoctoral student at MIT, said in a statement. "If you move the light — say, by a foot — your appearance changes dramatically."
The newly developed system allows photographers to input the direction from which they want the rim light to come, as well as the width of the desired rim, or how much of the subject should be lit. Thedrone then flies itself to the proper side of the subject and maintains the specified rim width.
"If somebody is facing you, the rim you would see is on the edge of the shoulder, but if the subject turns sideways, so that he's looking 90 degrees away from you, then he's exposing his chest to the light, which means that you'll see a much thicker rim light," Srikanth said. "So, in order to compensate for the change in the body, the light has to change its position quite dramatically."
The handy drone can also adjust itself based on the photographer's movement. The robotic flier uses control signals from the photographer's camera to determine how to position itself.
These control signals are emitted from the camera about 20 times a second, with the camera producing an image that, rather than being stored in the camera's memory, is transmitted to a computer. The computer runs an algorithm, created by the researchers, that constantly evaluates the rim width and adjusts the drone's position accordingly.
Frédo Durand, one of the project's researchers and a professor of computer science and engineering at MIT, said that this self-correction feature was the most difficult part of the drone project.
"The challenge was the manipulation of the very difficult dynamics of the [drone] and the feedback from the lighting estimation," Durand said. "That's where we put a lot of our efforts, to make sure that the control of the drone could work at the very high speed that's needed just to keep the thing flying and deal with the information from the [drone's laser range finder] and the rim-lighting estimation."
The prototype drone performed well in the motion-capture studio where it was tested, Srikanth said. However, making the drone robust enough to serve as a photographer's assistant in the real world may be trickier.
But overcoming such challenges should be possible, given the rapid advancements in robotics and related technologies, said Ravi Ramamoorthi, a professor of computer science and engineering at the University of California, San Diego.
Follow Elizabeth Palermo on Twitter @techEpalermo, Facebook orGoogle+. Follow Live Science @livescience. We're also on Facebook &Google+. Original article on Live Science.