Generating illusionary organisms for the game-engine ecosystem
This project is my case study from my time at the Animation Workshop's Creative Simulation Technologies course in Viborg, Denmark. The course was led by Andrew Lowell and sponsored by the European Union's Creative Media program. If you are curious about the technical aspects of this project, I have posted Unreal Blueprint tutorials on freehand splines and Houdini Engine on YouTube to further document my process.
I present an inquiry into the ecology of real-time graphics: Can an artist draw digital 'organisms' out of thin air? Can they exert influence over these organisms, imparting life and embedding them into an environment? In a future where media art is increasingly real-time and generative, can digital artists experience embodied creativity through procedural content generation (PCG)?
To address these questions, I prototyped a speculative set-dressing system that allows artists to draw and manipulate organisms and their environment using freehand input, inspired by the way illustrators have adopted digital art tools like Procreate and Blender's Grease Pencil. To accomplish this, I connected Unreal Engine's new scriptable tools interface to Houdini Engine to create an embodied ecosystem for real-time PCG. I use the term 'embodied' to describe the system's intention rather than its current state, as full embodiment would require a gestural interface and deeper graphical immersion. Despite these technical limitations, the goal of embodiment still informed all of my design decisions, and I created an interface that lets artists reach their creative target without a complicated UI, overwhelming asset management, or other common technical bottlenecks.
By emphasizing artistic flow and playfulness, the artist becomes a part of this virtual, organic ecosystem, fully immersed in their digital craft.
The videos below are real-time captures from Unreal Engine.
Game Engine Recordings
All of the organisms in one environment. Set-dressing this scene took me only ~10 minutes in real time!
Organisms can thrive in different contexts, even transforming into a living, breathing wearable
The ecosystem is alive and interactive
Generated assets can embody multiple biological levels: from a unicellular organism to a large-scale ecosystem of its own (as seen above)
Shaders are procedural, enabling full exploration at both the micro and macro level
Eraser: Each time a Houdini asset is generated, an accompanying spline and collision mesh are also added to the Unreal map. Removing a procedural asset the traditional way means sorting through the Outliner for the asset and its extraneous components, severely disrupting the artist's flow. To address this, I created a Blueprint that tracks HDAs and their components in memory, letting the artist brush over a proxy asset in the viewport and delete it from the scene in one stroke.
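The bookkeeping behind the Eraser can be sketched as a small registry. In the actual project this lives in an Unreal Blueprint; the plain Python below only illustrates the logic, and all names (`AssetRegistry`, the string handles) are hypothetical stand-ins for HDA actors and their components.

```python
# Sketch of the Eraser bookkeeping: every component spawned alongside an
# HDA is recorded under one asset id, so a single erase action can return
# the whole group for deletion. Names are illustrative assumptions.

class AssetRegistry:
    def __init__(self):
        # proxy asset id -> everything spawned alongside it
        self._components = {}

    def register(self, asset_id, components):
        """Record the spline, collision mesh, etc. created with an HDA."""
        self._components[asset_id] = list(components)

    def erase(self, asset_id):
        """Remove the asset's entry and hand back its companions."""
        return self._components.pop(asset_id, [])

registry = AssetRegistry()
registry.register("hda_01", ["hda_01_spline", "hda_01_collision"])
print(registry.erase("hda_01"))  # ['hda_01_spline', 'hda_01_collision']
```

The key design choice is that the artist never touches the Outliner: brushing over the proxy resolves to one `asset_id`, and the registry supplies every companion object to destroy.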
Camera: This tool simply swaps the camera position between two presets of the artist's choosing, which is very handy when the artist tumbles around to place elements but wants to quickly return to their locked-in camera angle(s) without worrying about overwriting them. There are other ways to swap cameras in Unreal, but they take quite a few more clicks.
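The swap can be sketched as a toggle over two read-only slots, which is what guarantees that tumbling around never overwrites a preset. Camera poses are plain tuples here rather than Unreal transforms; everything below is an illustrative assumption.

```python
# Sketch of the two-preset camera swap: swapping only reads the stored
# poses, so free navigation can never clobber them.

class CameraPresets:
    def __init__(self, preset_a, preset_b):
        self._presets = [preset_a, preset_b]
        self._active = 0

    def swap(self):
        """Toggle to the other preset and return its pose unchanged."""
        self._active = 1 - self._active
        return self._presets[self._active]

cams = CameraPresets((0, 0, 0), (5, 5, 5))
print(cams.swap())  # (5, 5, 5)
print(cams.swap())  # (0, 0, 0)
```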
Play: This tool lets the artist interact with the procedural organisms, exerting influence by nudging and poking them around. This is the most underdeveloped part of my project, as Houdini Engine's latency is too high to accommodate real-time physical deformation, so I used the shader graph to mimic it instead. A PCG system that is better integrated with Unreal allows for more tangible interactions, as demonstrated by the Niagara-powered tentacles shown here.
Adaptive Scaling: When spawning tentacles, their length is determined by the artist’s proximity to the object that they are spawning from. If the artist is using a fixed camera, this ensures that the tentacles in the foreground are short, so they don’t obscure any background elements.
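The proximity rule can be sketched as a simple distance-to-length mapping. The linear falloff and the min/max bounds below are assumptions for illustration, not the project's exact numbers.

```python
import math

# Sketch of the adaptive tentacle length rule: the closer the artist's
# viewpoint is to the spawn point, the shorter the tentacle, so nearby
# foreground tentacles never obscure background elements.

def tentacle_length(camera_pos, spawn_pos,
                    min_len=0.2, max_len=3.0, max_dist=20.0):
    dist = math.dist(camera_pos, spawn_pos)
    t = min(dist / max_dist, 1.0)  # 0 near the camera, 1 far away
    return min_len + t * (max_len - min_len)

print(tentacle_length((0, 0, 0), (0, 0, 20)))  # far away -> 3.0
print(tentacle_length((0, 0, 0), (0, 0, 0)))   # at the camera -> 0.2
```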
Exposed Parameters: After the artist draws an asset, they can still fine-tune user-defined parameters (such as color, size, and randomness) through the asset's Details panel in Unreal, thanks to the Houdini Engine plugin's built-in integration. This lets the artist further art-direct the procedural asset using a traditional PCG interface in addition to the drawing tools I implemented.
Cancel Operations: Unlike strokes in conventional drawing applications, procedural geometry takes time to generate, and the lag following a misplaced brush stroke can be frustrating. By pressing Alt while drawing an asset, an artist can save time by canceling the operation mid-stroke. Asset generation can also be canceled automatically if the input spline is drawn too large or out of bounds.
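The automatic cancellation check can be sketched as a quick validity test on the in-progress stroke. The 2D point format, thresholds, and bounds below are illustrative assumptions.

```python
# Sketch of the mid-stroke cancellation test: reject an input spline if
# any point leaves the working bounds or if its extent grows too large,
# before the costly procedural generation is ever triggered.

def should_cancel(points, max_extent=50.0,
                  bounds=((-100, -100), (100, 100))):
    (min_x, min_y), (max_x, max_y) = bounds
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    out_of_bounds = any(
        not (min_x <= x <= max_x and min_y <= y <= max_y)
        for x, y in points
    )
    too_large = (max(xs) - min(xs) > max_extent or
                 max(ys) - min(ys) > max_extent)
    return out_of_bounds or too_large

print(should_cancel([(0, 0), (10, 10)]))   # small, in bounds -> False
print(should_cancel([(0, 0), (200, 0)]))   # out of bounds -> True
```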
Finalization: Once the artist is satisfied with all their drawn PCG elements, they can click the 'refine' tool to automatically refine every proxy asset in the scene, or right-click to refine an individual asset via Houdini Engine's built-in panel. Refined meshes are not removed by the Eraser tool and persist after the project is closed, ensuring their permanence in the scene.
Spline Projection Modes: Freehand splines can either be drawn onto geometry (any object with a collision mesh) or projected onto the screen plane, depending on the artist's desired application.
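The two modes can be sketched as two ways of placing a stroke point along the view ray. A flat ground plane at z = 0 stands in for scene collision geometry here; the function name, the plane stand-in, and the fixed screen depth are all illustrative assumptions.

```python
# Sketch of the two projection modes: "geometry" intersects the view ray
# with a stand-in collision plane (z = 0); "screen" places the point at a
# fixed depth in front of the camera, ignoring the scene entirely.

def project_point(ray_origin, ray_dir, mode, screen_depth=5.0):
    if mode == "geometry":
        ox, oy, oz = ray_origin
        dx, dy, dz = ray_dir
        if dz == 0:
            return None  # ray parallel to the plane: no hit
        t = -oz / dz
        if t < 0:
            return None  # plane is behind the camera
        return (ox + t * dx, oy + t * dy, 0.0)
    # screen-plane mode: fixed distance along the view ray
    return tuple(o + screen_depth * d for o, d in zip(ray_origin, ray_dir))

print(project_point((0, 0, 10), (0, 0, -1), "geometry"))  # (0.0, 0.0, 0.0)
print(project_point((0, 0, 10), (0, 0, -1), "screen"))    # (0.0, 0.0, 5.0)
```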

BIG thanks to my teachers Andrew Lowell, Adam Funari, Julia Silkina, Jeffy Mathew Philip, and Kate Xagoraris, and to Per Kristensen and the Animation Workshop for their support!
Unreal Engine and Houdini. Audio from Soundsnap.