
Interactivity at the Illuminarium

We mentioned interactivity in the first post about the Illuminarium, but we didn't explain how it works. Here are the basics: the Illuminarium has an array of LiDAR units installed in the ceiling of each of its venues. They aim downward and capture a point cloud of everyone in the space. The point cloud data runs through alignment and image processing software custom built by the Illuminarium to generate a few outputs. First, the data is processed into a black-and-white blob map of where people are located. The blobs are tagged with IDs so each person can be tracked persistently. The resulting data stream transmits those unique IDs and normalized location coordinates over a low-latency TCP connection at approximately 20 FPS. We read this data in Unreal Engine and assign it to game actors to trigger events, collisions, or anything else we can imagine.
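If you're curious what the consumer side looks like, here is a minimal sketch in plain C++. The "id,x,y" line format and the venue dimensions are placeholder assumptions for illustration, not the actual wire spec:

// Minimal sketch of consuming the tracking stream. The line-based
// "id,x,y" wire format (normalized [0,1] coordinates) and the venue
// dimensions below are assumptions for illustration only.
#include <map>
#include <sstream>
#include <string>

struct TrackedPerson {
    int   id;      // persistent blob ID from the LiDAR processor
    float worldX;  // position mapped into venue units
    float worldY;
};

constexpr float kVenueWidth = 30.0f;  // meters (assumed)
constexpr float kVenueDepth = 18.0f;  // meters (assumed)

void ConsumeLine(const std::string& line, std::map<int, TrackedPerson>& people)
{
    std::istringstream ss(line);
    int id; float nx, ny; char comma;
    if (ss >> id >> comma >> nx >> comma >> ny) {
        // Map normalized coordinates into venue units, then upsert by ID
        // so a game actor can stay bound to each persistent blob.
        people[id] = { id, nx * kVenueWidth, ny * kVenueDepth };
    }
}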

The other output from the LiDAR image processor is an NDI video stream of the blob map itself. For certain types of shader FX, this aligned NDI stream can be much more efficient than raw location data, so depending on the effect needed, we use one or the other.
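To see why the video path can be cheaper for full-screen effects: one texture sample per pixel replaces looping over every tracked person. Here's a rough sketch of treating the blob map as a lookup texture, assuming an 8-bit grayscale frame layout:

// Sketch: the NDI blob map as a lookup texture. A shader-style effect
// can sample occupancy at any UV in constant time, which is what makes
// the video stream efficient for full-screen FX.
#include <algorithm>
#include <cstdint>
#include <vector>

struct BlobMap {
    int width = 0, height = 0;
    std::vector<uint8_t> pixels;  // grayscale, one byte per pixel (assumed)

    // Nearest-neighbor sample at normalized UV; returns occupancy in 0..1.
    float Sample(float u, float v) const {
        int x = std::clamp(static_cast<int>(u * (width - 1)), 0, width - 1);
        int y = std::clamp(static_cast<int>(v * (height - 1)), 0, height - 1);
        return pixels[y * width + x] / 255.0f;
    }
};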

The LiDAR system we worked with for most of 2022 had its limitations. Occlusion, where furniture or tightly huddled groups shadow the data, can make continuous tracking impossible at times. And 20 FPS falls short of ideal for interactivity. But as in all things, the limitations must steer the creative, and we were fortunate to find ways around the issues. We are currently in the final test phase of a new system that relies on OSC instead of TCP and adds robust error correction and furniture removal. This was made possible by the excellent staff at the Illuminarium and their commitment to continually improving their system.
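For the curious, an OSC message for a single tracked blob might be packed like this. The "/blob/<id>" address pattern is purely illustrative, not the new system's actual protocol; the null-padding and big-endian float rules are standard OSC 1.0:

// Hypothetical OSC encoding for one blob update: an address pattern,
// a ",ff" type tag, and two big-endian floats, each field padded to a
// multiple of 4 bytes per the OSC 1.0 spec.
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

static void PushPaddedString(std::vector<uint8_t>& buf, const std::string& s) {
    buf.insert(buf.end(), s.begin(), s.end());
    buf.push_back('\0');
    while (buf.size() % 4 != 0) buf.push_back('\0');
}

static void PushBigEndianFloat(std::vector<uint8_t>& buf, float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof(bits));
    for (int shift = 24; shift >= 0; shift -= 8)
        buf.push_back(static_cast<uint8_t>(bits >> shift));
}

std::vector<uint8_t> EncodeBlobMessage(int id, float x, float y) {
    std::vector<uint8_t> buf;
    PushPaddedString(buf, "/blob/" + std::to_string(id));  // assumed address
    PushPaddedString(buf, ",ff");                          // two float args
    PushBigEndianFloat(buf, x);
    PushBigEndianFloat(buf, y);
    return buf;
}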



Our first show was a tribute to the work of Georgia O'Keeffe, directed by Randie Swanberg. Randie gave us an excellent launching point with beautiful animations of blooming flowers and wild streams of petals projected all around the space. We added interactive petals, created with a deterministic Niagara particle system: each audience member's location spawns a whirlpool that tracks along with them, swirling the flower petals around their feet.
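At its core, the whirlpool is just a tangential velocity field with falloff. Here's a stripped-down sketch of that math; the production version lives inside the Niagara system, and these constants are illustrative:

// Each tracked person becomes a whirlpool center; petals inside its
// radius get a tangential velocity plus a slight inward pull so they
// orbit the feet instead of flying off.
#include <cmath>

struct Vec2 { float x, y; };

Vec2 WhirlpoolVelocity(Vec2 petal, Vec2 center,
                       float radius = 1.5f, float strength = 2.0f)
{
    float dx = petal.x - center.x;
    float dy = petal.y - center.y;
    float dist = std::sqrt(dx * dx + dy * dy);
    if (dist > radius || dist < 1e-4f) return {0.0f, 0.0f};

    float falloff = 1.0f - dist / radius;        // stronger near the center
    Vec2 tangent = { -dy / dist, dx / dist };    // perpendicular = swirl
    Vec2 inward  = { -dx / dist, -dy / dist };   // mild pull toward center
    return { (tangent.x + 0.3f * inward.x) * strength * falloff,
             (tangent.y + 0.3f * inward.y) * strength * falloff };
}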


In another section of the piece, we used the NDI feed to create paint strokes in O'Keeffe's palette behind each visitor as they walked around the space, letting the audience contribute to the art themselves.
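One way to get trailing strokes out of a live blob map is accumulation: instead of displaying the current frame, fold each frame into a persistent canvas so the paint lingers wherever people have walked. Roughly the idea, with an optional per-frame decay (parameters illustrative):

// Accumulate each incoming blob-map frame into a persistent canvas.
// With decayPerFrame = 0 the paint stays forever; a small decay lets
// older strokes fade. Buffers are assumed to be the same resolution.
#include <algorithm>
#include <cstdint>
#include <vector>

void AccumulatePaint(std::vector<float>& canvas,            // persistent, 0..1
                     const std::vector<uint8_t>& blobFrame, // latest NDI frame
                     float decayPerFrame = 0.0f)
{
    for (size_t i = 0; i < canvas.size(); ++i) {
        float occupancy = blobFrame[i] / 255.0f;
        canvas[i] = std::max(canvas[i] * (1.0f - decayPerFrame), occupancy);
    }
}

A canvas like this, rather than the raw blob map, would then be what the material tints with the palette colors.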


We also created effects that let the audience part seas of blooming daisies and light up storm clouds from underneath. The daisies were done with another Niagara system, which spawned and bloomed the flowers and then let the users act as collision objects within the field. The cloud interaction was done by piping the NDI feed directly into the dynamic cloud shader. Each interaction took advantage of the Illuminarium's system in its own way, bringing the audience closer to the work and making everyone feel connected.
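The parting effect boils down to each daisy leaning away from the nearest person, with the bend easing off at the edge of a personal-space radius. A simplified sketch; in the show this runs as Niagara collision, and the numbers here are illustrative:

// Returns a lean offset to apply to a flower's tip, pushing it away
// from a nearby person and easing off toward the edge of the radius.
#include <cmath>

struct Vec2 { float x, y; };

Vec2 DaisyLean(Vec2 flower, Vec2 person,
               float radius = 1.0f, float maxLean = 0.4f)
{
    float dx = flower.x - person.x;
    float dy = flower.y - person.y;
    float dist = std::sqrt(dx * dx + dy * dy);
    if (dist > radius || dist < 1e-4f) return {0.0f, 0.0f};

    float push = maxLean * (1.0f - dist / radius);  // strongest when closest
    return { dx / dist * push, dy / dist * push };
}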



For the "Space" project, directed by Gavin Guerra, we needed more advanced interactive systems that integrated complex Blueprint logic. These necessitated the use of cluster events, where a single node handles an event and broadcasts it to all the other nodes. We used them for what has proven to be the most popular interaction of any show: the asteroids. When a visitor collides with a drifting Kuiper Belt asteroid, it explodes into pieces and ejects dust into space. The kids go nuts.
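Here's the general shape of that pattern in plain C++, not the literal nDisplay API: the primary node evaluates the gameplay trigger and broadcasts an event, and every node, primary included, applies it identically so all the projectors stay in lockstep:

// Generic primary-node-broadcast pattern (the event name and fields
// are illustrative). Only the primary decides; every node replays.
#include <functional>
#include <string>

struct ClusterEvent {
    std::string name;     // e.g. "AsteroidExplode" (illustrative)
    int         actorId;
    float       time;     // cluster-synchronized timestamp
};

struct ClusterNode {
    bool isPrimary = false;
    std::function<void(const ClusterEvent&)> broadcast;  // network send, stubbed

    void OnAsteroidHit(int asteroidId, float clusterTime) {
        // Non-primary nodes ignore local hits and wait for the broadcast,
        // so the explosion happens exactly once across the cluster.
        if (isPrimary)
            broadcast({ "AsteroidExplode", asteroidId, clusterTime });
    }

    void OnClusterEvent(const ClusterEvent& ev) {
        // Runs identically on every node, keeping the wall seamless.
        if (ev.name == "AsteroidExplode")
            ExplodeAsteroid(ev.actorId, ev.time);
    }

    void ExplodeAsteroid(int /*id*/, float /*t*/) { /* spawn debris and dust FX */ }
};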


Another interaction, inspired by the footprints in the first Illuminarium show, "Wild", lets visitors leave their boot prints on the lunar surface as they walk around. This one uses a predictive system to make up for some latency in the LiDAR pipeline: we estimate where a person is walking and offset the footprints forward to where they will actually be about 50 ms later, which keeps the prints aligned at an average walking pace. We also use the same logic to extrapolate each user's trajectory, making sure the prints face the right way!
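A sketch of that prediction: velocity from the last two LiDAR samples, a ~50 ms lead along that velocity for position, and the heading taken from the same vector. The 20 FPS sample rate and 50 ms figures come from above; the rest is illustrative:

// Estimate velocity from two consecutive samples, push the print ahead
// by the system latency, and face it along the direction of travel.
#include <cmath>

struct Vec2 { float x, y; };

struct FootprintPose { Vec2 position; float headingRadians; };

FootprintPose PredictFootprint(Vec2 prev, Vec2 curr,
                               float sampleDt = 1.0f / 20.0f,  // ~20 FPS feed
                               float leadSeconds = 0.05f)      // ~50 ms latency
{
    Vec2 velocity = { (curr.x - prev.x) / sampleDt,
                      (curr.y - prev.y) / sampleDt };
    Vec2 predicted = { curr.x + velocity.x * leadSeconds,
                       curr.y + velocity.y * leadSeconds };
    float heading = std::atan2(velocity.y, velocity.x);  // faces walk direction
    return { predicted, heading };
}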


We're still developing more elements for Space, including a set of space platforms that fire thrusters under the users' feet. Stay tuned for more exciting news on that front. And in the meantime, if you're in Atlanta or Las Vegas, go check out the shows!








