The ‘Bye Bye’ family souvenir video is my one excuse to keep from getting rusty in 3D and video work (as my main job is taking a fair bit of my time).
This year, no particles, no wave spectrum. I decided to ramp up a bit with Modo instead: I might just stick with Softimage 2014 from now on and invest my hobby money in Modo (it seems a more complete toolbox for the hobbyist, and the amount of change Softimage was getting for the price of the subscription did not seem worth it).
So here, it’s a simple parallax trick around 2013. I sketched the 2013 and noted the various elevations (0-5) and whether a block top was flat, descending or rising. I wrote a little Python script to convert those values into grid data that I could feed directly into Softimage ICE using the String to Array node. That gave the ICE tree the final destination for the points making up each cube. The grid itself was done with ICE geometry, replicating a simple cube and tagging the proper material depending on whether it sits on a ‘black’ square or a ‘white’ one. Finally the whole thing was exported as MDD and loaded into Modo for the lighting, shading, camera work and, of course, final rendering. I must say it all worked quite smoothly! If you are interested, here is the Softimage scene: byebye2013.scn.
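The original script is not in the post, but the idea can be sketched like this (the cell encoding below, elevation plus a slope code packed into one integer, is my own assumption, not the actual format):

```python
# Hypothetical sketch: flatten a hand-sketched elevation grid into one
# space-separated string that ICE's "String to Array" node can parse.
# Elevations are 0-5; slope codes: 0 = flat, 1 = descending, 2 = rising.

grid = [
    [(2, 0), (3, 2), (5, 0)],
    [(1, 1), (2, 0), (4, 2)],
]

def grid_to_ice_string(grid):
    # Pack each cell as elevation*10 + slope so one integer carries both.
    values = [str(elev * 10 + slope) for row in grid for (elev, slope) in row]
    return " ".join(values)

print(grid_to_ice_string(grid))  # "20 32 50 11 20 42"
```

Inside ICE, the resulting array can then be indexed per cube to read back its target elevation and top shape.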
So what to say about modo? I’ve tried to love it for so long, but it never clicked like Softimage did. I’m never quite sure what to expect really, where Softimage it mostly always make sense. I’m saddened that Softimage seems to be stalling. I’ll sure keep using it as much as possible as I have a deep investment in it, but for a hobbyist, I much rather fork $400 to modo every other year for substantial update, rather that $900 a year for minor update (and now it would cost me 3k I believe!). Hopefully modo 801 will bring me joy!
Finally done, the 2012 Bye Bye video. It is my main excuse not to get too rusty with Softimage, Premiere and After Effects… So this is a wrap-up of our best family and friends pictures and videos, mixed with a trance track (Dash Berlin feat. Emma Hewitt – Waiting).
Technically, since I have limited time (and resources: my PC was churning out 1 frame every 6 minutes!), I wanted to go with a simple concept in terms of modelling (limited skills!), animation and rendering alike. And even then I had to trim it down a bit.
So this is a simple model built and animated in Softimage. There is ICE in there! The spectrum bars, which indeed follow the music, use my SpectrumToWave plugin described in prior posts. ICE is also used for the counter: it is actually 3 particles, each one assigned a value between 0 and 9 depending on the current time. That value is then used to pick the instance shape for the particle (there are 10 objects, each one representing one digit). The particles also track the position of 3 nulls bound to the surface of the memory pod.
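The digit-picking logic is simple enough to sketch in a few lines (a toy stand-in for the ICE tree; the three-digit seconds counter below is my own illustration, not the scene's exact setup):

```python
# Toy sketch of the counter logic: each of the 3 particles picks a digit
# (0-9) from the current time, and that digit selects its instance shape.
# Here the counter shows seconds as three digits (e.g. 042).

def digits_for_time(seconds):
    value = int(seconds) % 1000          # clamp to three digits
    return [value // 100, (value // 10) % 10, value % 10]

print(digits_for_time(42.7))   # [0, 4, 2]
print(digits_for_time(1234.0)) # [2, 3, 4]
```

Each returned digit is the index into the list of 10 digit objects used as instance shapes.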
The fake depth-of-field effect comes from the Magic Bullet Looks filter applied in the Premiere sequence. I used different looks for the various videos. Magic Bullet rocks!
This is something I had wanted to achieve for quite a while, and I finally got the right mix of pieces to make it work: pick a WAV file, get its spectrum decomposition (using a Fast Fourier Transform library named KISS FFT), and wrap the whole thing into a custom ICE node so I could use it to drive particles.
Here are a few trials: one using typical spectrum bars, the other using spectrum information to drive strands and deformation.
I’ve also made a tutorial on how to use the plug-in:
And finally here is a screenshot of the ICE tree (there is a smaller ICE tree before it that sets each point’s frequency, a normalized value between 0 and 1 covering the full frequency range of the audio file):
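The core idea can be sketched in Python (the real plug-in uses KISS FFT inside a custom ICE node; NumPy's FFT and the function names here are mine, for illustration only):

```python
# Sketch: FFT one window of audio samples, then let each point sample the
# magnitude at its own normalized frequency (0..1), like the ICE attribute.
import numpy as np

def normalized_spectrum(samples):
    # FFT magnitude of one audio window, normalized to [0, 1]
    win = samples * np.hanning(len(samples))
    mags = np.abs(np.fft.rfft(win))
    peak = mags.max()
    return mags / peak if peak > 0 else mags

def value_for_point(mags, norm_freq):
    # norm_freq in [0, 1] maps onto the FFT bins, mirroring the per-point
    # frequency attribute set by the smaller ICE tree
    return float(mags[int(norm_freq * (len(mags) - 1))])
```

A pure sine lands its energy in one bin, so a point whose normalized frequency matches that bin reads a value near 1 while the others read near 0, which is what makes the bars dance.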
Here is a new little ICE video/how-to I posted on Vimeo. Lagoa is all about point cloud simulation, and while it can use a mesh to mold the emitted points, it does not directly deform the mesh.
So I made this little compound which binds the mesh back to the point cloud simulation. Since it only takes the closest point, it is a bit crude and requires a rather high density in the point cloud simulation. I think it could be improved by taking more than one point and, why not, weighting them with barycentric coordinates.
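For the curious, the closest-point binding boils down to something like this (a brute-force NumPy sketch of the compound's idea, with names of my own choosing; ICE does the lookup with its own nodes):

```python
# Naive sketch of the binding: each mesh vertex inherits the offset of the
# closest point in the rest-pose cloud. Crude, hence the need for density.
import numpy as np

def bind_to_cloud(vertices, rest_points, sim_points):
    deformed = vertices.copy()
    for i, v in enumerate(vertices):
        # find the nearest rest-pose cloud point for this vertex
        j = np.argmin(np.linalg.norm(rest_points - v, axis=1))
        # apply that point's simulated displacement to the vertex
        deformed[i] = v + (sim_points[j] - rest_points[j])
    return deformed
```

The barycentric improvement would replace the single `argmin` with the k nearest points and a weighted blend of their offsets.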
Here is another flocking experiment, now with strands! It’s a typical flocking setup: particles are attracted to particles of a similar color and pushed back by those with opposite hues. In addition, a gravity force keeps them within an orbit.
So after playing with particles, I was looking for something to do to start exploring ICE Kinematics. Thanks to the Pixelcorps for the idea: a procedural insect walk. To keep it simple I decided to do a centipede walk.
The video shows an overview of the logic and finishes with a fly-over of the actual ICE tree. If you want to dig deeper, you can always download the scene here: centipede scene – Softimage 2011
This is really a first attempt. There are many things left to do if we want an ‘agent-like’ centipede, able to react to its environment. On the other hand it might already be too fancy if you want to pre-plan the path to travel, although I think the tree could be adapted to do that.
I’ve always been intrigued by emergent behavior, and thanks to ICE I was finally able to devote some time to experimentation. So here are a bunch of particles trapped in a bottle and constrained by the typical 3 rules, each generating its own force:
cohesion: each particle is attracted to the particles of its own kind, as long as they are within a given radius and in the field of vision
alignment: each particle aligns to the average direction of its neighbors
separation: strong but very local force that prevents collision
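The three rules above can be condensed into a small Python sketch (weights, radii and function names are mine, not the scene's actual values, and I leave out the field-of-vision test and the kind/hue grouping for brevity):

```python
# Minimal sketch of the three steering forces for one particle i.
import numpy as np

def flock_force(i, pos, vel, radius=2.0, sep_radius=0.5):
    d = np.linalg.norm(pos - pos[i], axis=1)
    near = (d > 0) & (d < radius)
    if not near.any():
        return np.zeros(3)
    cohesion   = pos[near].mean(axis=0) - pos[i]      # pull toward the group
    alignment  = vel[near].mean(axis=0) - vel[i]      # match neighbors' heading
    very_near  = (d > 0) & (d < sep_radius)
    separation = (pos[i] - pos[very_near]).sum(axis=0) if very_near.any() else np.zeros(3)
    return cohesion + alignment + 3.0 * separation    # separation: strong but local
```

In the ICE tree each force is its own branch, summed and added to the point velocity every frame.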
I found ICE to be a very good playground for this. I must also mention Programming Game AI by Example by Mat Buckland for its good coverage of steering behaviors.
And that piece would just be a bunch of dots in a bottle without the music of Pierre Lapointe. Try listening to some of his songs; I think they are beautiful even if you do not understand his chiseled lyrics.
I was browsing Autodesk’s AREA when I stumbled upon a 3ds Max tutorial by Sasha Henrichs on using noise and deformations to model a stone procedurally. I thought that was a great starting point for exploring ICE deformation.
So it starts with a very high-res sphere (made of quads) wrapped in a lattice. An ICE tree is then applied which pushes points along their normals based on different levels of noise (coarse, medium, fine). Since all deformations happen on the mesh itself, you really need a high resolution.
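The push itself is a simple sum of octaves along the normal, something like this (the cheap sine-hash noise and the weights below are illustrative stand-ins for ICE's noise nodes):

```python
# Sketch of the multi-octave push along the normal: coarse, medium and fine
# noise are summed and the point is moved along its normal by that amount.
import math

def noise1(x, y, z):
    # cheap hash-style pseudo-noise in [-1, 1], just for illustration
    return math.sin(x * 12.9898 + y * 78.233 + z * 37.719)

def displace(p, n, coarse=0.3, medium=0.1, fine=0.03):
    x, y, z = p
    d = (coarse * noise1(x, y, z)
         + medium * noise1(2 * x, 2 * y, 2 * z)   # double frequency
         + fine * noise1(4 * x, 4 * y, 4 * z))    # quadruple frequency
    return (x + d * n[0], y + d * n[1], z + d * n[2])
```

Each octave doubles the frequency and shrinks the amplitude, which is what gives the stone its large shape plus surface grain.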
Once you are happy with the stone, you can freeze the ICE tree, maybe bake the fine detail onto a normal map, or apply a polygon reduction.
I have always been intrigued by vegetation generators, so I finally decided to give it a try. This flavor uses:
Strands to grow the branches: a state machine drives the branching and the level of nesting.
Billboard particles for the leaves: leaves are 2D images with an alpha cutout projected onto quads. The quads’ size and orientation change based on the age of the particles.
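The age-driven growth of the leaf quads amounts to a small remap (the easing curve and the numbers below are mine, for illustration; the actual values live in the ICE tree):

```python
# Toy sketch of the leaf billboard sizing: quad size ramps up with particle
# age, easing in so young leaves stay tiny and mature ones plateau.
def leaf_size(age, max_age=2.0, full_size=0.15):
    t = min(age / max_age, 1.0)        # 0 when born, 1 when mature
    return full_size * t * t           # ease-in growth curve
```

Orientation can be handled the same way, blending from the strand tangent toward an upright vector as the leaf matures.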
I have also made a PDF tutorial you can look at if you want to know more: ICE Vine Tutorial. It contains a detailed explanation with ICE tree screen captures. Hopefully it can get you going or spawn your own ideas!