With very best thanks to the VJ team "Mandala"
for this artful work and for sharing it with us, as well as for their report here at VJun.io with the links & info about "Machine Music x Art 2.01"!
It would also be very interesting to learn in more detail which software and hardware setup was used for the live set. In the "opening credits" I can already spot a few things, such as Resolume, which apparently served as the central output interface. I'm assuming it was Arena, of course, but maybe Wire was involved as well. I see four monitors in the cover picture, and I can see Resolume Arena on one of them. What is running on the others, and how was the signal flow organized?
Which software was used for post-production (3D animation, video post-processing and so on)? And that leads me to ask: which parts were created live (real-time video) and which were pre-produced content? Maybe even some background information on the creation of one of the pieces used, or even a small tutorial, would be great.
Additional links to the artists Xavier and Adem Jaffers (aka VJ Mandala), such as their own websites, Vimeo, YouTube, etc., could also be added, so that we can find out more about them, for example their artist bios or upcoming appearances at shows & events.
In the video, two or three things already give me food for thought, and I wonder to what extent fragment shaders were used, and with which tools they were played out and integrated. In particular, I got the impression that a lot of work was done with particle systems, and I would be interested to know whether these were created in real time or in post-production, and how they were designed.
Incidentally, I would like to draw your attention once again to my 78 free plugins for Resolume Arena (working fully from 7.13.2 onwards, partially from 7.7 onwards), in connection with the software used for "Machine Music x Art 2.01":
mywix3.wixsite.com/bennoh/post/50-...
I would be very happy to see my plugins in your work at an upcoming "Machine Music x Art".
With best regards from Switzerland/Europe, bennoH. 🐼