Houdini R&D Experiments #1 (2019)

Exploring how to work Houdini into realtime pipelines

Houdini became a top priority for me late in 2017, once I realized how useful it could be for indie developers generating 3D assets for their worlds. I did all of the art and programming for my VR music visualizer Raybeem, so I figured that if I was going to keep working with small teams, I needed to automate whatever I could.

Houdini has a steep learning curve, and fully grasping it has been a challenge, but it's all starting to make sense to me now. This post is a collection of the demos I've done over the past year in an effort to become fluent in Houdini.

Procedural Modeling

An example of Houdini generating an asset from a curve input. In this case, the input curve is aligned to specified grid rules before the model is generated.
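
As a rough illustration of the grid-alignment idea (not the actual network), here's a minimal Python sketch that snaps the points of an input curve to a grid before the asset would be generated; the grid spacing and the list-of-tuples point format are assumptions.

    # Minimal sketch: snap incoming curve points to a grid before generating geometry.
    # The grid spacing and the point format are assumptions for illustration.

    GRID_SPACING = 0.5  # world units between grid lines (assumed)

    def snap_to_grid(point, spacing=GRID_SPACING):
        """Round each component of a point to the nearest grid line."""
        x, y, z = point
        return (
            round(x / spacing) * spacing,
            round(y / spacing) * spacing,
            round(z / spacing) * spacing,
        )

    def align_curve(points, spacing=GRID_SPACING):
        """Snap every point of an input curve, dropping consecutive duplicates
        that collapse onto the same grid cell."""
        aligned = []
        for p in points:
            snapped = snap_to_grid(p, spacing)
            if not aligned or aligned[-1] != snapped:
                aligned.append(snapped)
        return aligned

    # Example: a hand-drawn curve gets cleaned up onto the grid.
    raw_curve = [(0.1, 0.0, 0.2), (0.6, 0.0, 0.9), (1.4, 0.0, 1.1)]
    print(align_curve(raw_curve))  # [(0.0, 0.0, 0.0), (0.5, 0.0, 1.0), (1.5, 0.0, 1.0)]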

I'm working on more scenes that take advantage of procedural layout. In this demo, I'm exploring how pillars are placed within a room as it changes scale.
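
The layout logic boils down to deriving a pillar count and positions from the room's current dimensions. Here's a minimal sketch of that idea in Python; the margin and target spacing are assumed values, not ones from the demo.

    # Minimal sketch of procedural pillar layout: given the room's current width and
    # length, derive how many pillars fit along each axis and where they go.
    # Margin and target spacing are assumed parameters.

    def pillar_positions(room_width, room_length, target_spacing=3.0, margin=1.0):
        """Return (x, z) positions laid out on a grid that rescales with the room."""
        usable_w = max(room_width - 2.0 * margin, 0.0)
        usable_l = max(room_length - 2.0 * margin, 0.0)

        # At least two pillars per axis so the corners are always covered.
        count_x = max(int(usable_w // target_spacing) + 1, 2)
        count_z = max(int(usable_l // target_spacing) + 1, 2)

        positions = []
        for ix in range(count_x):
            for iz in range(count_z):
                x = -usable_w / 2.0 + usable_w * ix / (count_x - 1)
                z = -usable_l / 2.0 + usable_l * iz / (count_z - 1)
                positions.append((x, z))
        return positions

    # As the room scales up, more pillars appear automatically.
    print(len(pillar_positions(8.0, 8.0)))   # 9  (3 x 3)
    print(len(pillar_positions(14.0, 8.0)))  # 15 (5 x 3)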

Terrain

Terrain model generated in Houdini. Areas are masked and exported into separate color channels.

Terrain splatmap generated from Houdini. From left to right: the combined RGB image, then the red, green, and blue channels.
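
For context on how the channels come together, here's a minimal sketch (using Pillow and NumPy, which are outside the Houdini/Unity pipeline itself) of packing three grayscale masks into one RGB splatmap; the file names are placeholders.

    # Minimal sketch of splatmap packing: three grayscale masks become the R, G, and B
    # channels of a single image. Requires Pillow and NumPy; file names are placeholders
    # and the masks are assumed to share the same resolution.
    import numpy as np
    from PIL import Image

    def pack_splatmap(red_mask_path, green_mask_path, blue_mask_path, out_path):
        """Load three grayscale masks and write them out as one RGB splatmap."""
        channels = [
            np.asarray(Image.open(p).convert("L"), dtype=np.uint8)
            for p in (red_mask_path, green_mask_path, blue_mask_path)
        ]
        splat = np.stack(channels, axis=-1)  # shape (H, W, 3)
        Image.fromarray(splat, mode="RGB").save(out_path)

    pack_splatmap("mask_rock.png", "mask_grass.png", "mask_dirt.png", "splatmap.png")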

Displacement exported from Houdini and imported into Unity as terrain. Material splatmap imported into Unity terrain.
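
One way (among several) to get an exported heightmap image into Unity's terrain importer is to convert it to the 16-bit RAW format the Import Raw dialog accepts. A rough sketch, with placeholder file names and assuming a square grayscale image:

    # Rough sketch: convert a grayscale heightmap image into 16-bit little-endian RAW
    # for Unity's terrain Import Raw dialog. File names are placeholders.
    import numpy as np
    from PIL import Image

    def heightmap_to_raw(image_path, raw_path):
        """Normalize a grayscale heightmap and write it as 16-bit RAW."""
        img = Image.open(image_path).convert("I")               # 32-bit grayscale
        heights = np.asarray(img, dtype=np.float64)
        span = max(heights.max() - heights.min(), 1e-9)
        normalized = (heights - heights.min()) / span           # remap to 0..1
        (normalized * 65535.0).astype("<u2").tofile(raw_path)   # little-endian uint16

    heightmap_to_raw("terrain_displacement.png", "terrain_height.raw")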


The next step would be to combine the procedurally generated terrain with instanced grass and vegetation as demonstrated in this instanced grass shader demo.

I'm also working on combining the terrain with a river tool: you input a spline curve and it projects a river down onto the surface of the terrain.
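
The core of that projection is dropping each spline point onto the terrain height beneath it. A minimal sketch of the idea, assuming the terrain is available as a height lookup function and using an assumed carve depth:

    # Minimal sketch of the river tool's projection step: each point of the input
    # spline keeps its X/Z position but has its Y replaced by the terrain height
    # underneath it, minus a small carve depth. The height function and depth value
    # are assumptions for illustration.
    import math

    def project_spline_onto_terrain(spline_points, terrain_height, carve_depth=0.2):
        """Project spline points straight down onto the terrain surface.

        spline_points: list of (x, y, z) tuples
        terrain_height: callable (x, z) -> surface height at that location
        """
        projected = []
        for x, _y, z in spline_points:
            surface_y = terrain_height(x, z)
            projected.append((x, surface_y - carve_depth, z))
        return projected

    # Example with a fake rolling surface standing in for the real heightfield.
    fake_terrain = lambda x, z: 2.0 * math.sin(0.1 * x) + 1.5 * math.cos(0.1 * z)
    river_curve = [(0.0, 10.0, 0.0), (5.0, 10.0, 2.0), (10.0, 10.0, 5.0)]
    print(project_spline_onto_terrain(river_curve, fake_terrain))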

Houdini Terrain in WebGL

I've been doing demos for a fully 3D WebGL version of my portfolio site for years now. Before this version of the site, I was exploring a BabylonJS/Unity pipeline and created a demo that pulled a Houdini terrain in as an FBX. The idea was to make browsing the site feel like exploring an RPG world map, like in early Final Fantasy games. After estimating the time it would take to get the splatmap shader working in that pipeline, build the other assets I needed, and figure out the UI interaction, I decided it was too much work at the time. I went in the direction of this current site and will explore the 3D website concept more down the road.

Rigid Body and Effects

Rigid body destruction experiments. The challenge has been figuring out how the rigid body systems work and finding the limitations of bringing this data into realtime engines like Unity and Unreal Engine.

Some rigid body simulation tests. These helped me get a grasp of fracturing and glue constraints, as well as working with a rigid body solver.
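
To get simulation data out of Houdini and into an engine, one option is to bake per-piece transforms over the frame range with Houdini's Python API. A rough sketch of that approach, assuming packed pieces with an orient point attribute and a placeholder node path and frame range:

    # Rough sketch of baking packed rigid-body piece transforms to a file that a game
    # engine could read. Assumes this runs inside Houdini (the hou module), that the
    # sim is packed geometry with an "orient" quaternion point attribute, and that
    # the node path and frame range below are placeholders.
    import json
    import hou

    SOP_PATH = "/obj/destruction/OUT_pieces"  # placeholder path
    FRAME_RANGE = range(1, 121)               # placeholder frame range

    def bake_piece_transforms(out_path):
        sop = hou.node(SOP_PATH)
        frames = []
        for frame in FRAME_RANGE:
            hou.setFrame(frame)
            geo = sop.geometry()  # re-cooked at the current frame
            pieces = []
            for pt in geo.points():
                px, py, pz = pt.attribValue("P")
                orient = list(pt.attribValue("orient"))  # quaternion attribute (assumed present)
                pieces.append({"position": [px, py, pz], "rotation": orient})
            frames.append(pieces)
        with open(out_path, "w") as f:
            json.dump(frames, f)

    bake_piece_transforms("rbd_bake.json")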

Using simulated fluid geometry as particles within Unity.

This Unreal Engine 4 demo uses a cloud mesh generated in Houdini as a surface for spawning cloud particles.

I got an understanding of how to use solvers with a FLIP fluid sim to simulate raindrops. I'm not sure this is a practical technique for games, but I can see a way to make this method work in a realtime shader.

First Production Work!

Finally got to make use of Houdini for an AR project for the Dolby SoHo mobile app.

These are some of the AR animations I created for the project. In Unity, the shader reveals them based on their UVs.
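
The reveal itself is simple threshold math against a UV coordinate. Here's a minimal CPU-side sketch of that fragment logic; the actual effect lives in a Unity shader, and the choice of the V axis and the softness value are assumptions.

    # Minimal CPU-side sketch of a UV-based reveal: a fragment's alpha is driven by
    # comparing its V coordinate against an animated reveal amount, with a soft edge.
    # The real effect is a Unity shader; axis choice and softness are assumptions.

    def smoothstep(edge0, edge1, x):
        """Standard smoothstep: 0 below edge0, 1 above edge1, smooth in between."""
        t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
        return t * t * (3.0 - 2.0 * t)

    def reveal_alpha(uv_v, reveal_amount, softness=0.05):
        """Alpha for a fragment with V coordinate uv_v, given reveal progress 0..1."""
        return smoothstep(uv_v - softness, uv_v + softness, reveal_amount)

    # As reveal_amount animates from 0 to 1, fragments with lower V fade in first.
    for amount in (0.0, 0.25, 0.5, 1.0):
        print(amount, round(reveal_alpha(0.5, amount), 3))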

This is the network I cobbled together for this particular model.

I've got a long way to go, but it's been a very rewarding experience. I feel that Houdini is a great tool for putting both my creative and technical capabilities to use.

Stay tuned!

-Bryson
