Luma Labs released Dream Machine last week and the AI startup is already rolling out its first round of upgrades, including the ability to extend a clip by five seconds.
Dream Machine can generate photorealistic video and accurate real-world motion in a way we’ve only seen so far from the closed OpenAI Sora model and the Chinese Kling AI.
I tested Dream Machine when it was first released and it felt like a first step. It produced impressive output (after a 12-hour wait due to demand) but was in need of a few UI additions, and it seems those are already arriving.
The first updates have gone live, including clip continuation and an easier way to download a video you created. Pro users can also remove the watermark.
Dream Machine Extend launching today
Being able to extend a clip was likely one of the first updates Luma wanted to launch, and it is already available to users of the platform. Each extension uses one generation from your monthly allocation, which varies depending on your subscription.
"Coming soon to Dream Machine - powerful editability and intuitive controls! Here's a sneak-peak of something exciting we are working towards. #LumaDreamMachine Sign up for early access and to be the first to know when it is ready: https://t.co/UMuEr8tMu7" — Luma Labs on X, June 17, 2024
Competitors like Pika Labs and Runway have offered video extension from day one, but often with limited success: the longer the video gets, the more distorted or inconsistent it becomes. Luma promises its approach will be different.
I haven't personally seen the results yet; I put in a prompt to extend an existing clip, but it hadn't returned at the time of posting this story. The process is simple: you click Extend on a clip, give it a fresh motion prompt and it carries the video forward.
"Extend is an advanced system that is aware of what's happening in your video and extends it in a consistent way to follow instruction," the company wrote on X.
Discovery and editing coming soon
Luma says we can also expect to see a new discovery feature coming to the interface in the future. This will allow you to explore different video concepts and ideas.
One of the most promising new features is in-video editing. It isn't live yet, but it will let you change backgrounds and foregrounds on the fly in any generated video. For example, you could replace one character with another or put them in a new location.
"Today, we are releasing Extend video. Extend is an advanced system that is aware of what's happening in your video and extends it in a consistent way to follow instruction. Share your extended creations (>10s) as a quote to this tweet and with #LumaDreamMachine and the top 10…" — Luma Labs on X, June 17, 2024
This is similar to inpainting in AI image generation and close to a feature already available in Pika Labs. Having not tried it, I can't compare the two beyond pointing out that the concept already exists elsewhere.
All we have as a demo of this feature is a video clip, but in it you can see new context menu options appear when you play the video, including ways to change the background of the video you've created.