Luma AI, a startup specializing in AI graphics, presents an impressive new feature: 3D flythroughs generated from 2D video that resemble professional drone flights.
First, the app generates a 3D environment from a regular video. According to Luma AI, this is done using “the world’s most advanced 3D generative NeRF technology”. Neural radiance fields (NeRFs) are neural 3D representations of scenes or objects captured in 2D footage. They can be generated in seconds to minutes without any special training data.
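Luma has not published its architecture, but the basic NeRF idea can be illustrated with a toy sketch: a small network maps a 3D position and viewing direction to a color and a density, and an image is rendered by sampling these values along camera rays and compositing them. The network below uses random weights purely for illustration; a real NeRF fits its weights to the captured video.

```python
import numpy as np

def positional_encoding(x, n_freqs=4):
    # Map coordinates to sin/cos features at increasing frequencies,
    # which helps a small network represent fine detail.
    feats = [x]
    for i in range(n_freqs):
        feats.append(np.sin((2.0 ** i) * np.pi * x))
        feats.append(np.cos((2.0 ** i) * np.pi * x))
    return np.concatenate(feats, axis=-1)

class TinyNeRF:
    """Toy radiance field: (position, view direction) -> (RGB, density).
    Weights are random here; a real NeRF trains them on the input video."""
    def __init__(self, hidden=32, n_freqs=4, seed=0):
        rng = np.random.default_rng(seed)
        in_dim = 3 * (1 + 2 * n_freqs) + 3  # encoded position + raw view dir
        self.n_freqs = n_freqs
        self.w1 = rng.normal(0, 0.1, (in_dim, hidden))
        self.w2 = rng.normal(0, 0.1, (hidden, 4))  # outputs: RGB + density

    def query(self, pos, view_dir):
        h = np.concatenate([positional_encoding(pos, self.n_freqs), view_dir], axis=-1)
        h = np.maximum(h @ self.w1, 0.0)           # ReLU hidden layer
        out = h @ self.w2
        rgb = 1.0 / (1.0 + np.exp(-out[..., :3]))  # sigmoid -> colors in [0, 1]
        density = np.maximum(out[..., 3], 0.0)     # non-negative density
        return rgb, density

def render_ray(field, origin, direction, near=0.0, far=4.0, n_samples=32):
    # Sample points along the ray and alpha-composite their colors,
    # as in standard NeRF volume rendering.
    t = np.linspace(near, far, n_samples)
    pts = origin + t[:, None] * direction
    dirs = np.broadcast_to(direction, pts.shape)
    rgb, density = field.query(pts, dirs)
    delta = (far - near) / n_samples
    alpha = 1.0 - np.exp(-density * delta)
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    weights = alpha * trans
    return (weights[:, None] * rgb).sum(axis=0)  # composited pixel color
```

A virtual flythrough then amounts to rendering rays from a sequence of new camera origins and directions along the chosen path.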
Drone flight without a drone
The video-to-3D feature is not new; Luma introduced it back in March. What is new is “3D Camera Path AI,” which allows users to fly freely through the 3D model, just like flying a drone. The virtual flight path is simply set in the app. Luma AI has not yet revealed any technical details about 3D Camera Path AI.
Such virtual drone flights were already possible, but you had to switch to AR mode and move the smartphone through the scene or define keyframes in a more complex editor.
Luma AI has now simplified the flight function into a new iPhone app called “Luma Flythroughs,” which it is marketing explicitly to the real estate industry, but which anyone can download from the App Store. According to Luma AI, the app can replace expensive professional interior shots for real estate sales.
In the first promotional trailer and on its website, Luma AI shows high-quality examples of flythroughs generated with the app.
Some 3D scenes still show visual glitches, especially scenes with a lot of detail. These glitches could likely have been avoided with a little more time and care during capture. Outdoor shooting is also possible.
In general, the higher the quality of the source material, the more detailed and flawless the 3D scene will be. According to Luma, casually shot footage in good lighting conditions is sufficient. Special tools such as stabilizers should not be necessary to create a high-quality flythrough video.