Download Pack
This pack contains 83 VJ loops (90 GB)
https://www.patreon.com/posts/75359952
Behind the Scenes
Years ago I read a short story about morphing chrome on the horizon of the future: clearly visible and approaching at great speed, yet distant and vague, beckoning with multifaceted promises and sharp with vicious nightmares. I always thought it was an evocative metaphor for technology that is increasingly pervasive in our lives. The image stuck with me, and I wanted to finally visualize it.
Some of my very first experiments with DallE2 were creating images of chrome metal in all sorts of mangled futuristic shapes. So when I was looking back through old work I stumbled across these early experiments and knew it was something I should revisit. DallE2 has a unique style that often looks different from Stable Diffusion and Midjourney, particularly with chrome metal, so I decided to continue using DallE2 even though it's a pain to manually save each image. I generated 1,324 images in total using DallE2. If the DallE2 devs are reading this, please add a method to batch output. In the future I want to explore using Stable Diffusion to generate variations based on an image and see how that goes.
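Until the web UI gets a batch export, the API can approximate one. Here's a minimal sketch assuming the openai Python package (v1.x), an OPENAI_API_KEY in the environment, and a placeholder prompt and output folder rather than the actual prompts used for this pack:

```python
# Minimal batch-generation sketch using the OpenAI Images API.
# Assumptions: the `openai` package (v1.x), OPENAI_API_KEY set in the
# environment, and placeholder prompt/folder names -- not the exact
# prompts used for this pack.
import base64
from pathlib import Path

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
out_dir = Path("dalle2_chrome")
out_dir.mkdir(exist_ok=True)

prompt = "morphing liquid chrome shapes on a dark horizon"  # placeholder
for batch in range(4):  # 4 batches x 10 images = 40 images per prompt
    result = client.images.generate(
        model="dall-e-2",
        prompt=prompt,
        n=10,                      # max images per request for DALL-E 2
        size="512x512",
        response_format="b64_json",
    )
    for i, img in enumerate(result.data):
        path = out_dir / f"chrome_{batch:02d}_{i:02d}.png"
        path.write_bytes(base64.b64decode(img.b64_json))
```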
Then I took all of the DallE2 images and retrained the FFHQ-512x512 model using StyleGAN2. I also tested retraining FFHQ-1024x1024, but it's slower and I had trouble getting it to converge because the gamma (R1 regularization) hyperparameter is delicate to tune. So I stuck with 512x512 for the model.
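For anyone curious about the mechanics, this is roughly what transfer learning looks like with NVIDIA's stylegan2-ada-pytorch repo. The paths and the gamma value below are placeholders, not the exact settings used here:

```python
# Transfer-learning sketch assuming NVIDIA's stylegan2-ada-pytorch repo;
# folder names and the gamma value are placeholders.
import subprocess

# 1. Pack the DallE2 images (already cropped/resized to 512x512)
#    into a training dataset.
subprocess.run([
    "python", "dataset_tool.py",
    "--source=dalle2_chrome",          # folder of generated PNGs
    "--dest=datasets/chrome512.zip",
], check=True)

# 2. Fine-tune from the pretrained FFHQ-512 checkpoint. --gamma is the R1
#    regularization weight; it's the knob that most affects convergence,
#    especially at 1024x1024.
subprocess.run([
    "python", "train.py",
    "--outdir=training-runs",
    "--data=datasets/chrome512.zip",
    "--gpus=1",
    "--resume=ffhq512",                # start from the FFHQ 512x512 model
    "--gamma=10",                      # placeholder; tune per dataset
], check=True)
```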
The battle continues between the ScaleUp AE plugin and Topaz Labs Video Enhance AI. In this specific context, the ScaleUp AE plugin produced more desirable results when using the "sharp" setting, which significantly increases the render time to 1.2 seconds per frame. Ouch! So these were some heavy renders, but the "sharp" setting conjures up some unique details and I feel it's worth the added render time. I'm very curious to see how these uprez tools continue to mature in the future. Already Stable Diffusion 2 has added uprez capability for images and it'll be interesting to see if the AI models behind these video uprez tools can be trained to better understand all sorts of subjects.
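To put that 1.2 seconds per frame in perspective, a quick back-of-the-envelope calculation (the loop length and frame rate below are assumptions for illustration, not the pack's actual specs):

```python
# Back-of-the-envelope render time at 1.2 s/frame. The 30 s / 60 fps loop
# is an assumed example, not this pack's actual spec.
SECONDS_PER_FRAME = 1.2
loop_seconds, fps = 30, 60

frames = loop_seconds * fps                  # 1800 frames
hours_per_loop = frames * SECONDS_PER_FRAME / 3600
print(f"{hours_per_loop:.1f} h per loop")                # 0.6 h per loop
print(f"{hours_per_loop * 83:.0f} h for all 83 loops")   # ~50 h total
```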
Had lots of fun doing compositing experiments in After Effects. In the last few pack releases I've realized that removing specific colors and then purposefully blowing out the remaining colors often takes the video to a whole new level. So I've started exploring a bunch of effects that I normally wouldn't touch. For instance, Time Difference applied to a layer typically doesn't look very interesting, but it looks great if I remove specific colors via Color Range and add some Deep Glow. I did similar experiments with the Exclusion blend mode and created a glitchy, iridescent, shimmery look. In other experiments I tinted the metal a cool blue and made the highlights glow in various colors, which was tricky to pull off the way I was imagining. I experimented with the Limiter effect to treat the perfect white highlights in a unique way. And I just cannot stay away from the wonderfully bizarre Slitscan effect.
With all of these amazing DallE2 images sitting unused after the StyleGAN2 training, I wanted to give them a new life, but I felt stumped. I finally landed on treating all of the images as a sequential frame sequence in AE, applying the Echo effect to have them automatically overlap, and then playing with Pixel Sorter 2, RSMB, and tons of Deep Glow. For the "EchoMinStep2_FakeMotionBlur" video I pushed the RSMB fake motion blur setting to the absolute max, and I think it has an electric feeling that works great with intense music.
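AE will only import stills as a single sequence if the filenames are numbered sequentially, so the 1,324 images need consistent names first. A small sketch (folder names are placeholders, and it assumes plain alphabetical order is the order you want):

```python
# Sketch: rename a folder of DallE2 stills into a zero-padded sequence so
# After Effects can import them as one image sequence. Folder names are
# placeholders; ordering here is simple alphabetical sort.
from pathlib import Path
import shutil

src = Path("dalle2_chrome")
dst = Path("dalle2_sequence")
dst.mkdir(exist_ok=True)

for frame, img in enumerate(sorted(src.glob("*.png"))):
    shutil.copy2(img, dst / f"chrome_{frame:05d}.png")  # chrome_00000.png, ...
```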
Recently I realized that I had given up on adding motion blur via ReelSmart Motion Blur, since it cannot be forced to work on slow-moving footage, which I admit is an unusual request. But seeing as these videos are designed to be played at 200% or 400% of their original speed... why not just render out a version that is sped up on purpose? I just needed to speed up the footage in AE, pre-comp it, and then RSMB could work its magic. Such a simple solution that I hadn't thought of before.
Doing some displacement map experiments in Maya/Redshift produced some tasty results. The displacement map brings out some interesting details, and while the brute-force global illumination can be a bit noisy at lower settings, it still looks great. Someday I'll upgrade my GPU and go wild on those GI settings.
I'm growing frustrated with the color banding that happens in encoded videos with dark scenes and lots of subtle gradients from glowing FX. So I did some tests with rendering out 10-bit H264 and H265 MP4 videos from After Effects. They played back just fine in VLC, and the gradients were so lush! But when I dragged the 10-bit H264 MP4 into Resolume 7.13.2, it froze until I force quit it. And the 10-bit H265 MP4 wasn't even recognized by Resolume. So it doesn't look like 10-bit videos are supported in Resolume yet, which is a drag. I cannot find any Resolume documentation on the subject either. Yet it's without a doubt the most popular VJ software currently, so I gotta support it. I'm not entirely sure how much color banding matters when projected, since light bounces around and washes out the image, but I think it does matter for LED screens, which are popping up at concerts with increasing frequency. Something to revisit in the future. Mirrorshades!
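For anyone who wants to reproduce the 10-bit test without After Effects, here's a rough ffmpeg equivalent (file names are placeholders; this is one way to produce 10-bit test files, not the render path used for the pack):

```python
# Sketch: produce 10-bit test encodes with ffmpeg, as an alternative to
# rendering straight out of After Effects. File names are placeholders.
import subprocess

# 10-bit H.264 (High 10 profile). Many software players handle this, but
# hardware decoders and some VJ apps may not.
subprocess.run([
    "ffmpeg", "-i", "master.mov",
    "-c:v", "libx264", "-profile:v", "high10",
    "-pix_fmt", "yuv420p10le", "-crf", "18",
    "h264_10bit.mp4",
], check=True)

# 10-bit H.265/HEVC. Handles subtle gradients better at a given bitrate,
# but playback support is even spottier.
subprocess.run([
    "ffmpeg", "-i", "master.mov",
    "-c:v", "libx265",
    "-pix_fmt", "yuv420p10le", "-crf", "18",
    "-tag:v", "hvc1",   # helps QuickTime-based players recognize the file
    "hevc_10bit.mp4",
], check=True)
```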