Download Pack
This pack contains 170 VJ loops (255 GB)
https://www.patreon.com/posts/132984329
Behind the Scenes
I recommend downloading the MOV files if you want the transparency included in the video clips. Or download the MP4 files if you prefer a black background. The only difference between the MOV and MP4 videos is the alpha channel. Download the project files if you want to edit the animations or jam with the bug images.
Imagine with me a blend of Cthulhu's cousins, the Zerg swarm from Starcraft, and real insects of Earth... I daydream of alien bugs jamming to some hard-hitting music. Multi-legged insects reaching out and swiping at the viewer. Surreal buzzing monstrosities curious for just a little tasty bite. I've wanted to work on this idea for over 10 years but have been waiting for the tools to mature... And that time has arrived!
As a proof of concept, I started by finding a photo of a praying mantis on Pixabay.com to use as a test, manually cutting it out in Photoshop, and laying out the limb rotation axis points in After Effects. I then tested the Vybe Motion Synthesizer plugin, hoping it would streamline the animation process by relying on procedural animation, but it didn't do quite what I had in mind and so I gave up on it. Originally I thought it would be too much trouble to animate everything by hand, but I slept on it over a few days and realized that I should really give it a try. Plus I've long been curious to explore the various IK rigging plugins available for AE, which are quite mature by now. So I tested a few different IK rigging plugins and chose to run with the Limber 2 plugin since it's so well designed and thought out. After animating the proof of concept with this plugin, I knew that I'd found a golden workflow. So I reached out to Palpa to see if he was available to help me generate some alien bugs, and he was eager to experiment on this highly specific idea. I was a little nervous, since I'd found LoRAs for Stable Diffusion 1.5 to frequently be overfit on a concept, unable to generate a wide array of bug types and wild permutations. But Palpa had recently been making loads of LoRAs for Flux and assured me that Flux was on a whole different level and much better in this regard. And holy smokes was he right!
So I started out by precisely describing to Palpa what I was looking for in this library of alien bugs. I also made a high-level list of insect types that I'd like to be able to generate and threw in some crustaceans and aliens for good measure. Ideally the images would have dramatic overhead lighting, a mix of photorealism and CGI vibes, scientific-illustration poses for the bugs (shoutout to Ernst Haeckel [1834-1919]), and always a pure black background. I thought this was a high bar to aim for, but Palpa knew just how to cook it up. After seeing some of his other custom LoRAs, we agreed that the Flux model was ideal for this task. So we used Midjourney to generate some images of the different insect types that I had in mind. Then we curated the best 60 images and I cleaned them up in Photoshop to make the background perfectly black. Palpa then ran the selected images through the Magnific image uprez service, and I did another round of cleanup. From there Palpa trained a custom LoRA that would allow the Flux model to go off the rails and generate some really wild alien bugs. I was having trouble getting the text prompts to adhere, so I used a separate tool that auto-generates a text prompt from a given image; its output was far more descriptive than what I had written up, which helped streamline generating the different bug types I had in mind. I also experimented with throwing in various other LoRAs, just to allow for happy accidents during the batch render processing. Then I rendered out 2,678 images using the "Stable Diffusion WebUI Forge" app with "Hires. fix" enabled (so as to double the resolution from 512x512 to 1024x1024) and utilized a native script that batch renders through a large collection of different text prompts (sketched below). I also collected together the 255 images generated using Midjourney, leaving me with a library of 2,933 images in total. Then I curated through all the images and selected only the best 798 to uprez for collaging purposes. The PhotoSift app was useful for aiding curation; I really wish I'd had this tool long ago.
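For anyone curious about the batch setup, here's a minimal sketch of the kind of prompt list such a script consumes, assuming the built-in "Prompts from file or textbox" script that Forge inherits from the WebUI (one render per line; these prompts and flags are illustrative, not the actual project prompts):

macro photo of an alien mantis hybrid, glossy chitin, dramatic overhead lighting, pure black background
--prompt "surreal alien beetle, iridescent carapace, scientific illustration pose, black background" --seed 1337
--prompt "crustacean-insect hybrid, translucent shell, mix of photorealism and CGI, black background" --steps 30

Lines without flags are treated as the prompt itself, so thousands of variations can queue up unattended.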
After that I needed to uprez the 798 images. Although I was very impressed with the results of the Magnific image uprez service, I was not so pleased with the pricing. Credit-based subscriptions really kill my creative appetite and tend to stop me from taking risks. I've occasionally experimented with uprezzing in Forge but found it to be too fussy, particularly when batch rendering. I've long loved using the Topaz Gigapixel app, and yet they discontinued it a few years back. So I headed over to the Topaz website to see what options they had for image uprezzing and was thrilled to see that they had resurrected the Topaz Gigapixel app and were beta testing several new "Generative Models". The "Core Models" are a direct uprez and can only be pushed so far. But the "Generative Models" follow the overall structure of the image and then create new details where none existed prior, which is absolutely perfect for my need to render lots of experiments locally. So I imported the images into the Topaz Gigapixel app and used the Redefine model (Creativity: 4 / Text Prompt: "insect") to uprez the images to 2048x2048 so that it could imagine some rich details onto the bugs and also make them super sharp. Finally I curated once more, keeping only the absolute best images, and selected 42 bugs that I would use for the base bodies and collage more legs onto.
With this insane collection of alien bugs in hand, next up was the laborious task of making them into puppets. So I spent a big chunk of time working in Photoshop to cut out the bug limbs and then fill in the missing body area that was behind each limb. The Object Selection and Generative Fill tools in Photoshop were useful in streamlining this process, but it was still tedious work. It's so useful to have AI "inpainting" within Photoshop that functions regardless of how detailed the alpha input is. Generative Fill does generate some very rough alpha edges by default, which is easily fixed but feels like an oversight in its implementation. Also, Generative Fill would sometimes generate image variations out of context from my insect framework, which isn't surprising, and yet if I input a text prompt then it would often be much too literal and also ignore the scale. For instance, if I had a selection and wanted Generative Fill to fill in a beetle body, it would instead generate a whole new beetle within the selection area. Tricky issue. And so I would always leave the text prompt empty and just run a few variation gens until I found something that I was happy with. But I recognize I'm working with the tool in a bizarre and highly specific way, and even still it was super helpful; I only rarely needed to switch over to the Spot Healing Brush or Clone Stamp Brush. I also cut out about 300 different legs, antennae, claws, mandibles, mouthparts, wings, thoraxes, and other abstract bits so that I could have a library to collage together and add more articulable details into the bugs. Since the lighting was almost always overhead, I could easily collage together different limbs and then rotate the color hue to match. I feel like the Frankenstein of insects! Muahahaha!
The next step was to make these bugs literally dance. To rig the bugs for animation, I relied heavily on the Limber 2 plugin in After Effects. This amazing plugin allowed me to use an IK rig to realistically animate the legs and give each insect a unique personality. Popping joints are always such a pain, so I was particularly impressed that it includes an anti-pop attribute and also a length scale attribute that can correct for the missing scale that the anti-pop attribute eats up. I also relied on basic 2D rotation for the smaller legs, antennae, and such. I experimented with enabling 3D rotation transforms in After Effects, particularly for the wings, but it just ruined the illusion when seen close to edge-on; they suddenly felt like paper cutouts. I used the Loopy plugin to very quickly loop keyframes per layer. Something I utilized frequently was a variety of AE wiggle expressions (with dedicated seeds) to give some of the bug parts randomized jittery movements at a specified time frequency. I was interested to learn that I could have both a wiggle expression and keyframes laid down on the same property, and AE would sum both movements into the visible animation. Really useful. I also realized that I could use the AE Puppet tool on an adjustment layer to add even more squash/stretch to some of the bugs. Although the AE Puppet tool can be quirky since it's expecting a static image, and if you instead apply it via an adjustment layer onto an animated comp, the warp matte can glitch, particularly for sub-pixel areas like hair. Plans within plans within plans. Comps within comps within comps.
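To make the seeded wiggle setup concrete, here's a minimal sketch of the kind of expression I'm describing (the seed, frequency, and amplitude are illustrative numbers, not pulled from the project files):

seedRandom(7); // dedicate a fixed seed so this layer's jitter is repeatable between renders
wiggle(2, 25); // jitter 2 times per second, with up to ~25 units of travel

Since wiggle() offsets whatever the keyframed value is at the current time, the random jitter and the hand-made keyframes sum together into the final movement.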
From here it was animation time! Since this is a special VJ pack, I allowed myself 2 months to work on it in total: one month to make the bug library and rig the puppets, and then another month dedicated to manually animating the bugs. So with each bug I'd start by imagining how I wanted it to move around, remember the limitations of its rig, and then breathe some life into it. Overall it's just keyframes, a bit of wiggle expressions, and loads of pre-comps. Each of the 42 different bugs is animated with a focus on 120 BPM since that is a decent average for music. This tempo also works really well for animation since it translates to exactly 2 beats per second (2 beats per second x 60 seconds = 120 beats per minute). So if I aim for 1-2 keyframes to hit every second, it's easy for VJs to sync the dancing bugs to the live music. And since I'm rendering at a frame rate of 60fps, VJs can easily retime the visuals and they'll still play back smoothly. Although in my tests I realized that keeping all of the movement perfectly in sync made it feel sterile. Nature is rarely perfectly mechanical, so for some of the limb animations I made use of Euclidean rhythms to keep the movements feeling fresh over time (see the sketch below). Also the "static" video clips do not leave the edges of the video frame, so you can place these bugs anywhere on your canvas, or maybe even use Resolume Wire to animate the video in new ways. Each of the "static" video clips is about 1 minute in duration so that the randomized movements have plenty of time to be expressed.
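As a rough sketch of how a Euclidean rhythm can gate a limb, here's an illustrative AE expression on a rotation property (the counts and swipe angle are made-up numbers, not from the actual project files):

// Spread k = 5 hits as evenly as possible across n = 8 beats at 120 BPM
var n = 8, k = 5, bps = 2; // 120 BPM = 2 beats per second
var step = Math.floor(time * bps) % n; // current beat within the cycle
// Bresenham-style spread: the difference is 1 on "hit" beats, 0 otherwise
var hit = Math.floor((step + 1) * k / n) - Math.floor(step * k / n);
value + hit * 15; // add a 15-degree swipe on hit beats, rest otherwise

This yields the pattern 01011011, a rotation of the classic E(5,8) Euclidean rhythm, so the limb hits on an uneven-but-regular cycle instead of every single beat.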
Just as I was starting to animate the bugs, I experimented with adding different FX onto the exoskeleton body so that it would have extra creepy-crawly vibes. Even though bugs typically have a rigid exoskeleton, it looked really great to bend this rule at times. I used the native FX within After Effects such as Ripple, Turbulent Displace, Wave Warp, Warp, or CC Bend It depending on what I was looking to do. But I quickly ran into a classic problem where the body would distort too much at the joint sites and make it feel like the limbs weren't really connected. Basically the added distortion was ruining the illusion of it being a cohesive puppet. This problem has cropped up endlessly for me over the years in different forms, so I finally did some research and stumbled across the BAO Distortion Selector 2 plugin, which allows FX to be selectively applied to a layer/comp according to an alpha/luminosity map. This plugin was incredibly useful since I can paint a luma map and then selectively add FX to only certain areas of the bug within a pre-comp, and then continue animating as usual in the upstream comp. Since this plugin utilizes displacement maps to pull off this trick, it also includes a "smooth" attribute to fix any aliasing that can occur, which is so nice to have. I cannot sing the praises of the BAO Distortion Selector 2 plugin enough since it is a game changer for me. I even reached out to the developer with a feature request to add support for adjustment layers and they implemented it! Top notch.
I've always kinda thought that I was a lousy character animator, but I've found that what actually plagued me was the complexity of 3D character animation, which was often a limiting factor and super frustrating. It turns out that I really enjoy the 2D animation process for making my ideas come to life, which is a pleasant wake-up call for future projects. A few of the bugs feel a bit stiff for my liking, but there's bound to be some duds when animating 42 different puppets. As an artist, I try to live by the mantra that one person's trash is another person's treasure. I don't claim to be the best animator, but I feel that for some of the bugs I was able to reach the original vision that I had in mind, and hence I'm happily satisfied with the results.
Rendering out these comps from After Effects was quite a slog since I was working at 3840x2160 at 60fps with the alpha channel included. I think that having alpha is the greatest thing since sliced bread since it opens up all sorts of layering options while performing or Spout Jamming with NestDrop. I desperately tried to avoid large file sizes by experimenting with green/blue/fuchsia backgrounds rendered out to an MP4, which could then be chromakeyed out within Resolume, but many of the bugs include a wide variety of colors and so the chromakey FX often keyed out important parts of the bug, which I found to be highly unsatisfactory. So I decided to just bite the bullet and include the alpha channel within the renders. But since that results in video clips with large file sizes, I decided to also offer a duplicate version without the alpha channel. Therefore I rendered out 2 video clips of each bug: MOV with an alpha channel and MP4 without an alpha channel. In my tests I realized that, for some reason, the DXV-alpha codec produces video clips at half the file size of the HAP-alpha codec, so many of you will rejoice that I've finally embraced DXV-alpha. If you need video clips with smaller file sizes and you're willing to make some compromises, then you can use Resolume Alley to convert the resolution to 1920x1080 at 60fps (with or without alpha).
While I typically obsess over adding motion blur into the scenes for enhanced realism, in this context it introduced too many tech issues. For some reason the native motion blur of AE will severely alias when certain distortion FX are applied, and I couldn't figure out a solution. The native AE motion blur was also making the render times much too heavy anyway. So instead I first rendered everything out to PNG frames with the intention of applying the ReelSmart Motion Blur (RSMB) plugin afterward. But once I applied RSMB and looked at the frames, the bugs were moving too fast and so the motion blur frequently had obvious glitches, especially where the arms were moving. Hence I made the tough decision to entirely disable motion blur for these renders. Although bugs with flapping wings have a Radial Blur FX applied to them, so they have a pseudo motion blur that feels correct for that extremely fast motion. I had also wanted to make a variation of each bug where only select parts of the body glowed in the dark, but it didn't look quite right since the glow wasn't truly self-illuminating, I couldn't easily fake any global illumination light bounces, and then I ran out of time.
Personally I think this project showcases an ideal workflow for an artist utilizing AI tools tightly within their work. AI tools don't have to be all or nothing, as seems to be the current popular approach. The Flux AI model helped me to quickly generate and explore thousands of different possibilities at a high quality, which is far more than I would ever have been able to achieve by doing physical photography of bugs and then cutting out and collaging them together. Honestly the huge scope of that approach always scared me away from starting this project, so it's interesting to look back and see how things have changed. After using AI tools to generate the library of alien bugs, everything else was done manually by me (curation, cutouts, collaging, rigging, animation). I think it's absolutely critical for there to be a human in the loop guiding the creative decision process. Since I'm only a single artist producing these VJ loops, using an AI tool in this way is quite transformative, and it's hard to ignore how it opens doors to creative ideas that were previously impossible.
Do you want to animate these bugs on your own? If you want to dive in and animate these bugs in new ways, I've decided to release the project files for this VJ pack! This includes the (x42) layered Photoshop files, (x1) After Effects scene containing all of the original keyframed animations, (x2,933) original images generated using Flux and Midjourney, and (x1) LoRA for the Flux model that generates a wide variety of alien bugs. The LoRA works great with the Flux1-Dev-fp8 (8-bit quantized) version of the AI model, and I used the Stable Diffusion WebUI Forge app to generate the images. The images were cut out into layers using Photoshop 2024. The animations were created using After Effects 2024, the Limber 2 plugin, and the BAO Distortion Selector 2 plugin, so you will need these plugins if you wish to edit the After Effects scene. More relevant notes are shared within the project files. If you end up using any of these assets within a project then a credit is required. And I'd love to see it, although no approval is necessary.
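If you load the LoRA within Forge, the standard prompt-tag syntax should apply; a hedged example where the filename and weight are placeholders (check the notes within the project files for the real ones):

an alien wasp with translucent wings, dramatic overhead lighting, pure black background <lora:alien_bugs:0.8>

I'm bugging out!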