AI fashion show using dall-e to generate hundreds of outfits of the day after tomorrow. An interesting way to brainstorm costume and fashion design ideas.
Collab with
@shyamagolden
@OpenAI
#dalle2
#dalle
#aiart
Using
@OpenAI
dall-e 2 AI to dream up what a Chinese restaurant skyscraper might look like. The fluid frame interpolation was made with
@runwayapp
new super slow motion setting that also uses the power of AI to morph between frames.
#aiart
#ai
#architecture
#vfx
#dalle
Washed Out "The Hardest Part"
I leaned into the hallucinations, the strange details, the dream-like logic of movement, the distorted mirror of memories, the surreal qualities unique to Sora / AI that differentiate it from reality. Embrace the strange.
“The Simpsons” but make it an experimental cubist stop motion. Felt right given the long tradition of reanimating the Simpsons intro. Created with the spellbindingly addictive
#Gen1
AI video generator from
@runwayml
— still early days but a step into the future of
#animation
#ai
#aiart
Using dall-e 2 to manifest what’s not real. Another test combining dalle stop motion with moving video. This time blending AI-generated objects with a real object
#dalle
#openai
#experiment
#dalle2
#aiart
NeRFs of the Louvre made from a handful of short iPhone videos shot during a location scout last month. Each shot reimagined over a month later. The impossibilities are endless. More to come…
Made with
@LumaLabsAI
.
#nerf
#vfx
#aiart
#shotoniphone
Impossible shot I’ve dreamed of for years. Made from 13 separate drone takes composited into one, made possible with
@DJIGlobal
’s new Inspire 3 “motion control” system. Thanks to
@dronedudes
and the human motion control
@hokutokonishi
First official commissioned music video made with
@OpenAI
Sora for
@realwashedout
This was an idea I had almost 10 years ago and then abandoned. I was finally able to bring it to life.
Watch the full video here
Made with Sora. The Golden Record - from raw earth material to a time capsule of human life on Earth. Using 11 different generations cut together from Sora, I was able to explore what the odyssey of this record might look like.
@OpenAI
Tempting to use AI for everything, but drawing my own storyboards has always been part of my process. “Absolve” had almost 70 shots to get at the Louvre in 8 hours, so everything had to be meticulously organized. Even with the rapidly evolving AI VFX, much of the original intent carried through to the end
Exploring Space / Time with Sora. This isn’t going to replace the filmmaking process; rather, it’s offering an entirely new way of thinking about it. Not restricted by time, money or other people’s permission, I can explore ideas and experiment in bold and exciting ways
Made with
@OpenAI
OpenAI's Sora can do a lot more than just generate text to video. This example shows an objectively beautiful morph from one subject into another. The VFX implications here are staggering
“Thank You For Not Answering” short film written by me and “co-directed” with
@runwayml
#gen2
. Made from images+text 2 video. While it’s not quite reality, it presents us with an entirely new aesthetic. There is beauty in the imperfections.
#aiart
#AI
#aivideo
#filmmaking
The you you are today is not the you you were yesterday.
Another week, another AI experiment using
#dalle2
and
@runwayml
This time AI was used to animate face masks, generate jewelry and create organic rock formations
#aiart
#vfx
#ai
Where do we evolve from here? Using
#dalle2
inpainting onto video frames to generate objects on the leaf. No Photoshop. No photos. All AI.
AI was used to generate the voice of David Attenborough. Written with some help from
@MissyElliott
#AIart
#dalle
@OpenAI
“Absolve” VFX Breakdown. This film was a deeper exploration of the AI VFX techniques I have been developing, with the goal of seeing whether AI VFX could be used in a more cinematic narrative setting. Made with stable diffusion
@runwayml
@LumaLabsAI
@AdobeAE
#ai
#vfx
#breakdown
Spinning into summer’s embrace with the amazing moves of
@hok
and Erica Klein
ComfyUI Animation lead
@makeitrad1
Made with support by Lenovo
More soon...!
Resurrecting old ideas with Sora — things that otherwise would have never seen the light of day. Going through the archive of unproduced music video pitches, I can now see what they might have looked like. It’s like viewing an alternate timeline.
Made with
@OpenAI
#sora
Beyond fascinated by whatever voodoo
@runwayml
#Gen2
is conjuring. These are some outtakes from a short I'm releasing next week. Simulating water is some processor-intensive stuff. That's why generating will replace rendering in future VFX
#ai
#aiart
#generativevideo
#generativeai
Finally got to see Michelangelo's David in Florence and rather than just take a photo like a normal person, I spent 20 minutes walking around it capturing every angle looking like an insane person. It's hard to look cool when making a
#NeRF
but damn it looks cool later
@LumaLabsAI
How to evolve a building with AI… Using
#dalle2
in combination with
@runwayml
roto and slow-mo tools to “animate” without actually having to animate. Dall-e creates the keyframes and Runway does the tweening by using AI enhanced frame interpolation.
#aiart
#vfx
#breakdown
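The keyframe-plus-tweening idea above can be sketched in miniature. This toy function does a linear cross-fade between two keyframes; it's a crude stand-in for Runway's learned frame interpolation (which predicts motion rather than blending pixels), and the flat pixel lists here are purely hypothetical:

```python
def tween(key_a, key_b, steps):
    """Generate `steps` in-between frames by linearly cross-fading
    two keyframes (each a flat list of pixel intensities).
    A crude stand-in for AI frame interpolation, which predicts
    motion instead of simply blending."""
    frames = []
    for i in range(1, steps + 1):
        t = i / (steps + 1)  # blend weight from key_a toward key_b
        frames.append([(1 - t) * a + t * b for a, b in zip(key_a, key_b)])
    return frames

# Two tiny 4-pixel "keyframes" standing in for Dall-e outputs
key_a = [0.0, 0.0, 0.0, 0.0]
key_b = [1.0, 1.0, 1.0, 1.0]
inbetweens = tween(key_a, key_b, 3)  # 3 interpolated frames
```

A learned interpolator replaces the blend weights with motion estimation, which is why the results morph rather than ghost.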
Moving on is the hardest part. This is about losing a loved one and dreaming of them after they're gone. But memories are subjective, distorted mirrors of reality. That's what's in the latent space.
The Hardest Part by
@realwashedout
Made with Sora
“An Uncanny Hall Of Mirrors” A completely synthesized reality made with
@runwayml
beta of
#gen2
… and wow… it kinda rattled me to my core. All you need is an image or a text prompt - still a very early version but another paradigm shift in the world of AI filmmaking.
#aiart
#ai
Can AI create better VFX? Lots of VFX don’t look great because crews don’t know what they’re lighting for on set. Using a variety of AI tools, we can now move fluidly between pre and post. BGs made with stable diffusion, Photoshop Gen Fill, Magnific, Krea and Topaz and Runway Gen-2
What will TED look like in 40 years? For
#TED2024
, we worked with artist
@PaulTrillo
and
@OpenAI
to create this exclusive video using Sora, their unreleased text-to-video model. Stay tuned for more groundbreaking AI — coming soon to !
When crafting a shot, often you’re working right up to the edge of frame. Everything out of frame is pure chaos. Using
#dalle2
out-painting to break the fourth wall for the endings of films. Turns out Dall-e is the perfect vfx tool for doing set extension.
#ai
#aiart
#dalle
#zoom
The power to dream whatever you wish — transforming some plain ol’ video footage into a cloudy dream using
#VFX
and
#AI
tools powered by the Dell Precision 7770 Workstation and NVIDIA Studio’s creative ecosystem.
#aiart
#surreal
#animation
Got to play with
@runwayml
new
#Gen1
AI video generator that allows you to take an input video and transform it infinitely. A game changer for VFX down the line. This is composited with the original footage plus 3 Gen1 renders. No more stock elements!
#ai
#aiart
#vfx
The possibilities haven’t been fully realized yet. An attempt to make
#nerf
more cinematic. This will have a huge impact on VFX going forward. Shot on an iPhone, made with
@LumaLabsAI
#aiart
“ABSOLVE” by Jacques is on
@vimeo
Staff Picks. A trip through the Louvre where a cosmic teardrop reveals the agony underneath these historical artworks. Follow your tears and see where they might take you
Watch
#aianimation
#ai
#vfx
#stablediffusion
It all blends together after the first 6. Made earlier this year all within
@runway
using inpainting and frame interpolation - a twist on my first AI technique
Every bit of this animation is generated from
@runwayml
#gen2
using input images and text but the results are always a bit of a surprise. I still believe traditional animation has a bright future. It will only be amplified by being able to work smarter
#ai
#aiart
#animation
Wow been keeping this one under wraps for a min. New film I directed for
@gofundme
. An incredible amount of work went into developing the workflow to create a 2-minute living painting, blending live action and animation. Wouldn’t have been possible just 6 months ago
#ai
#animation
When the world needed kindness, you were there. Time and time again, donation by donation.
Here’s a video to show the power of all your help. Thank you for donating.
Learn more:
In a constant state of ascension, our past selves are left behind and our new self is made of infinite synthesis.
Featuring
@_MrOx_
#vfx
#runway
#videoart
1 year ago I made the first stop motion AI VFX shot using dall-e inpainting. This technique inspired many to use the tool in a new way and it expanded my mind as to what the future of VFX would look like. Been trapped in the rabbit hole ever since.
Another shot testing
@DJIGlobal
’s Inspire 3 “motion control” drone system. This is a shot a motion control arm could never achieve - maximum height is always an issue. But the real magic is
@hokutokonishi
’s ability to control his wall cycle in sync with the drone’s takeoff
Experimenting with
#img2img
using Muybridge’s first film of a black man riding a horse. As the
#ai
attempted to increase the quality, I was intrigued by what was lost. Entering this new realm of image-making, let’s consider what we’re adding and not overlook what came before
AI is ruining art. “Absolve” follows an out of control teardrop in the Louvre and tries to make crying in public look cool!
Liquid created with After Effects and
#stablediffusion
img2img. No 3D. An alternative to how we approach VFX
Watch:
#aiart
#vfx
Shot breakdown from “Absolve” using
@runwayml
#gen2
to generate a “3D” asset, then comped into the shot and run through Stable Diffusion
#img2img
- experimental workflow for now but glimpse into a future where generating will replace rendering
#ai
#aiart
#vfx
#aianimation
Sora is at its most powerful when you’re not replicating the old but bringing to life new and impossible ideas we would have otherwise never had the opportunity to see.
Check out more of my experiments on the blog post
AI is going to change VFX. This is a silly little experiment but it shows how powerful dall-e 2 is at generating elements into pre-existing video. These tools will become easier to use, so when spectacle becomes cheap, ideas will prevail
#aiart
#dalle
#ufo
@openaidalle
#dalle2
An infinite pan featuring my talented wife
@shyamagolden
and where she gets her inspiration. Stitching two different handheld shots together using
#dalle2
What a year. What is a year? A year gets more abstract every year. Here is a lot of the non abstract things that concretely happened. A lot happened, a lot didn’t happen but we will never know what that was. Thanks to everyone who made this less abstract of a year.
“Looking In, Looking Out” look out for it coming online tomorrow. 4 years ago, over 50 people sent me videos of what they were doing during the tragedy and mundanity of the 2020 lockdown. All the self-taped videos were stitched together into a video quilt of a single place and time
This is the video inpainting I’ve been waiting for. Nathan and I have been playing with some new techniques using
#animatediff
and video input with moving masks… get ready
"NOTES TO MY FUTURE SELF" VFX breakdown combining a variety of AI tools that shows the fluidity from pre-pro to post. The stable diffusion images became references for how to light the actors on set. The final BGs were upscaled, expanded and then injected with motion using Gen-2
Sometimes you wish they would just take you away.
Shot by Patrick Jones
Featuring Steve Nwachukwu
Music by: Found Objects
Composer: Louis Weeks
No green screen, no turntable platform, no wire rig. 😉 AI tools powered by
@runwayapp
#vfx
Every one of the 80 VFX shots in “Absolve” uses generative AI. However, traditional VFX is still needed for control — the team at PPVFX handled the Houdini simulations, which were processed through
#stablediffusion
#img2img
and ebsynth like a render engine.
#vfx
#ai
#aiart
#film
Outfit swapping is an effect I’ve done in my work a few different ways. AI has made this digital effect much easier and more complex but doesn't replace the need for and quality of practical effects. Nor does it replace actual designers
“Absolve” featuring
@yocestjacques
premieres tomorrow. A journey through the emotional realm of the Louvre where AI destroys historical artwork. Never thought they would allow me into the museum but here we are! Maybe the first
#AI
project to be printed to 16mm film?
#aiart
#vfx
To create this, the shot had to be tracked in 3D to create a moving mask that matches the camera move. The frames are exported as PNGs with transparency. Dall-e then fills in the transparent hole. The images are sequenced together and slowed down
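As a minimal sketch of that masking step, assuming frames as nested pixel lists (on real footage a library like Pillow or OpenCV would handle this), the alpha channel is zeroed wherever the tracked mask marks the hole, producing the transparent PNGs an inpainting model then fills:

```python
def punch_hole(frame_rgba, mask):
    """Zero out alpha wherever the tracked mask is True, producing
    a transparent hole for an inpainting model to fill.
    frame_rgba: rows of (r, g, b, a) pixels; mask: rows of booleans
    from the 3D camera track (hypothetical upstream step)."""
    out = []
    for row, mrow in zip(frame_rgba, mask):
        out.append([[r, g, b, 0 if hole else a]
                    for (r, g, b, a), hole in zip(row, mrow)])
    return out

# A 1x2-pixel "frame": the left pixel falls inside the tracked mask
frame = [[(255, 0, 0, 255), (0, 255, 0, 255)]]
mask = [[True, False]]
result = punch_hole(frame, mask)  # left pixel is now fully transparent
```

Running this per frame yields the PNG sequence with a moving transparent region that follows the camera move.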
For the second year in a row I had the pleasure of being on the jury for the Runway film festival… the progress of both technology and ideas has been exciting to watch. I will also be speaking May 1st screening in LA. Come say hey. Ticket info below…
Ten films. Special presentations. And a panel all about the persistence of story.
Join us in exactly one week for the second annual AI Film Festival in Los Angeles.
Select seats still available, request to attend at
Another behind the scenes look at
@gofundme
“Help Changes Everything” video - side by side comparison of the live action vs the composited animation and AI
#img2img
process. Featuring my dog Pepper, who we converted into a German Shepherd
#bts
#stablediffusion
#ai
#aiart
#vfx
In one week “Absolve” a new film with
@yocestjacques
will premiere. Shot on location at
@MuseeLouvre
and pushing AI visual effects to new heights. Stay tuned…
Life imitating art imitated by an AI. The oil paintings of
@shyamagolden
made into a custom
#stablediffusion
model. Excited by the possibilities to turn her art into motion.
Masks made in Sri Lanka by a family business: ceylonese boutique on Etsy
#videoart
#experimental
#aiart
The frame sequence is then uploaded to
@runwayml
- using their new super slow motion feature, I slowed the frame sequence down to 25% speed to force fluid interpolated motion that smooths out the raw, jittery stop motion
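The timing side of that slow-down is easy to sketch: at 25% speed each output frame maps back to a fractional position in the source sequence, and the fractional positions are exactly where new frames must be synthesized. This toy function (names hypothetical) computes those positions; the hard part, hallucinating the in-between pixels, is what the AI interpolation model does:

```python
def retime(num_frames, speed):
    """Map output frames back to fractional source positions when
    slowing footage to `speed` (e.g. 0.25 = quarter speed).
    Whole-number positions reuse original frames; fractional ones
    must be synthesized by the interpolation model."""
    out_count = int((num_frames - 1) / speed) + 1
    return [i * speed for i in range(out_count)]

positions = retime(5, 0.25)  # 5 source frames played at quarter speed
# positions step 0.0, 0.25, 0.5, ... up to 4.0 (the last source frame)
```

So a 5-frame stop-motion burst becomes 17 output frames, three synthesized for every original pair, which is what smooths the jitter.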
Last night I picked up the Gold Nica at the
@ArsElectronica
award ceremony for the Washed Out “The Hardest Part” music video. First award for use of AI in the 37-year history of the festival dedicated to electronic and computer-generated art