This pack contains 113 VJ loops (114 GB)
Behind the Scenes
I’ve had this daydream of watching an alien plant growing and mutating. So for the "Plants" scenes I wanted to train StyleGAN2 on a collection of images of different plants and see what happens, but with a perfect black background. After some sleuthing, I realized that photos of real plants weren’t ideal and that I should focus on botanical drawings instead. I ended up using the drawings of Pierre-Joseph Redouté (1759-1840). I spent a whole month just downloading, curating, and preparing the 1,309 drawings for this dataset. Since I really didn’t want StyleGAN2 to learn any unnecessary aspects, I had to photoshop each image manually. This involved white balancing, painting away any noise, painting out any paper edges/folds, painting out any text, and cropping to a square. It was a large amount of boring work but worth it in the end.
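Most of that cleanup had to be manual Photoshop work, but the mechanical steps can be scripted. As a rough sketch (not my actual workflow, and the gray-world balance here is a much cruder stand-in for hand white-balancing), here's what the crop-to-square and white-balance passes might look like in Python with NumPy:

```python
import numpy as np

def center_crop_square(img):
    """Crop an HxWx3 image array to a centered square on the shortest side."""
    h, w = img.shape[:2]
    side = min(h, w)
    top, left = (h - side) // 2, (w - side) // 2
    return img[top:top + side, left:left + side]

def gray_world_white_balance(img):
    """Crude gray-world balance: scale each channel so its mean matches the global mean."""
    img = img.astype(np.float32)
    means = img.reshape(-1, 3).mean(axis=0)        # per-channel means
    balanced = img * (means.mean() / means)        # rescale toward neutral gray
    return np.clip(balanced, 0, 255).astype(np.uint8)

# Stand-in for a scanned drawing page (portrait orientation).
page = np.random.randint(0, 256, (1200, 900, 3), dtype=np.uint8)
print(center_crop_square(page).shape)  # -> (900, 900, 3)
```

In practice the hand-painting (noise, paper folds, text) is exactly the part that can't be automated, which is why the dataset took a month.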
After training the SG2 model to a point where I was happy with the results, I started curating the best seeds and rendering out the videos. Then I sat on the videos for a little while due to a simple but annoying issue: all of the videos had a white background. I tried all sorts of experiments with color keying and nothing looked good. But then I remembered a useful trick: simply invert the colors and then rotate the hue to match the colors of the original video. From there it was all fun compositing in After Effects. I keyed out the black background, brought multiple plants together into a single comp, and made it feel as though they were slowly growing up from the bottom. I experimented with adding glow to every color except green, which resulted in all of the flowers having a rich vitality. I also experimented with a few plugins such as Modulation, Slitscan, Time Difference, and Pixel Encoder.
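The trick works because inverting turns the white background black but also flips every hue to its complement; rotating the hue halfway back around the wheel restores the original hues while leaving the background dark. A minimal sketch in Python (stdlib colorsys is slow per-pixel, so this is for illustration, not production):

```python
import numpy as np
import colorsys

def invert_and_rotate_hue(img, hue_shift):
    """Invert RGB colors, then rotate every pixel's hue by hue_shift (0..1)."""
    out = np.empty_like(img, dtype=np.float32)
    inv = 255 - img  # color inversion: white background becomes black
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            r, g, b = inv[y, x] / 255.0
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            r2, g2, b2 = colorsys.hsv_to_rgb((h + hue_shift) % 1.0, s, v)
            out[y, x] = (r2 * 255, g2 * 255, b2 * 255)
    return out.astype(np.uint8)

# A saturated red pixel: inversion makes it cyan, and a half-turn
# hue rotation brings the hue back to red, while pure white pixels
# (the background) come out pure black.
red = np.zeros((1, 1, 3), dtype=np.uint8)
red[0, 0] = (255, 0, 0)
print(invert_and_rotate_hue(red, 0.5)[0, 0])  # back to red
```

A 0.5 shift (180 degrees) is the natural choice after inversion, since inversion itself rotates hues by exactly half the wheel.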
While watching some workshops about StyleGAN2, I became a patron of Derrick Schultz and happened to see that he had shared two pretrained models. The StyleGAN-XL 512x512 and SG2 1024x1024 models were trained on botanical drawings by Jane Wells Webb Loudon (1807-1858). These were really fun to jam with; many thanks, Derrick! The "Blossom" and "Garden" scenes are the result of rendering out videos from these models and compositing experiments in AE.
For the "Mushrooms" scenes, I had been doing some explorations in Stable Diffusion and was able to nail down a text prompt that consistently output fields and forests full of mushrooms. So I rendered out 7,639 images and used them as a dataset for training SG2. The transfer learning converged rather easily, and from there it was all fun in AE. The Slitscan AE effect really made this one extra psychedelic. Chef's kiss to the BFX Map Ramp AE effect, since it allowed me to tweak the color gradients and then apply Deep Glow.
I then did some further explorations in Stable Diffusion and found a text prompt that would output flowers on bizarre vines. So I rendered out 17,124 images and used them to train SG2. But I probably gave the text prompt a bit too much creative freedom, so the dataset was quite diverse. This made it hard for the StyleGAN2 transfer learning to converge. This has happened to me on several prior occasions, and I typically just move along to something else. But this time I did some research on the gamma hyperparameter (the weight of StyleGAN2's R1 regularization on the discriminator), since it's known to be a difficult but important knob for fixing this type of convergence issue. After numerous gamma tests I was able to improve it, but not to the level I was hoping for. I think the "Flowers" scenes are beautiful regardless, though. It was satisfying to use the Color Range effect in AE to remove certain colors and then apply Deep Glow to the leftover colors.
It's surreal to see these various AI techniques mimicking mother nature and just making a beautiful mess of it. I tried to further push this digital feeling while compositing in AE and make it feel even more glitchy, as if the Matrix code was showing through. Green machined leaves.