Massive Projections at the Illuminarium
When we first tested 15- and 30-node projection systems running in sync in Unreal and Disguise, we knew we were doing something big...
Panoramic images of the main room (2) at the Atlanta Illuminarium, with and without projections.
In November of 2021, we started a lengthy journey with the amazing folks at the Illuminarium. Their goal was to create interactive, responsive graphics in Unreal across their 10,000 sq ft floor, driven by the LiDAR system built into the ceiling. We signed on as Unreal Engine support to get this fully interactive floor concept up and running. They also engaged us to design and execute the interactive portions of multiple shows, including "Space" and "Georgia O'Keeffe", but more on those later.
Our first task was to build an nDisplay system in Unreal that was compatible with Disguise and get it working across 15 nodes (more nodes than were generally used in virtual production at the time). The full floor resolution is a bit more than 11K by 8K, diced up across 15 RX render nodes feeding 4 VX streaming nodes. Getting it all working required a careful balance of network engineering, D3 engineering, and Unreal optimizations.

2022 was a huge year of learning and building. We started our testing in their Lab Space, a smaller setup with 2 projectors and 2 nodes for the floor. As we moved into the full projection space, we found we needed further optimizations and techniques to handle the significant network traffic and sync load. Once that core system was running at 30 and 60 fps, we began to consider content and a template project to make it easier for other teams to pick up where we left off. The Illuminarium team was working with a number of very talented studios on separate projects and needed a unified starting place for their Unreal configuration, so we generated a simple template project with the proper nDisplay configs and settings to ensure compatibility for anyone publishing to their system.
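To give a feel for the carve-up involved, here is a minimal sketch of dicing a large canvas into per-node viewport rectangles. The 5×3 grid, the exact pixel dimensions, and the `rx_NN` node names are illustrative assumptions only; the actual Illuminarium node layout and nDisplay config are more involved than this.

```python
# Illustrative sketch: dicing an ~11K x 8K projection canvas into 15
# per-node viewport rectangles. The 5x3 grid and node names are
# assumptions for illustration, not the actual Illuminarium layout.

CANVAS_W, CANVAS_H = 11264, 8192   # approximate full-floor resolution
COLS, ROWS = 5, 3                  # 15 render nodes, one viewport each

def node_viewports(canvas_w, canvas_h, cols, rows):
    """Return a dict of node name -> (x, y, w, h) viewport rectangle."""
    tile_w, tile_h = canvas_w // cols, canvas_h // rows
    viewports = {}
    for r in range(rows):
        for c in range(cols):
            # The last column/row absorbs any remainder pixels so the
            # tiles cover the canvas exactly with no gaps.
            w = canvas_w - c * tile_w if c == cols - 1 else tile_w
            h = canvas_h - r * tile_h if r == rows - 1 else tile_h
            viewports[f"rx_{r * cols + c:02d}"] = (c * tile_w, r * tile_h, w, h)
    return viewports

vps = node_viewports(CANVAS_W, CANVAS_H, COLS, ROWS)
```

In a real setup these rectangles would live in the nDisplay cluster configuration rather than code, but the same coverage invariant applies: every pixel of the canvas belongs to exactly one node's viewport.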
We built up a lot of detailed, hyper-specific knowledge around the determinism and sync issues sometimes shared by virtual production teams using D3; we'll go into some of that in our next post, on interactivity. We ran into a number of issues related to working in UE 4.27 at a time when 5.x was not yet available, and we were further delayed in jumping to UE5 by the need for compatibility with D3 and RenderStream, which is only now becoming available. We had some amazing help from the folks at Epic Games, who backported alpha channel fixes that existed in development versions of UE5. This was essential in allowing us to composite interactive elements on top of previously rendered material in real time on the floor. This first stage of getting a system working was an exciting time for us as well as the Illuminarium, and we'll detail more about the process in future posts.
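The compositing step those alpha fixes enabled can be sketched with the standard premultiplied-alpha "over" operator: the interactive foreground layer, carrying its own alpha, is laid over the pre-rendered background. This is a generic sketch of the math, not Unreal's or Disguise's actual pipeline code.

```python
# Generic sketch of premultiplied-alpha "over" compositing: an
# interactive foreground pixel (premultiplied RGB plus alpha) layered
# over an opaque pre-rendered background pixel. Standard Porter-Duff
# math, not the engine's actual implementation.

def over(fg_rgb, fg_a, bg_rgb):
    """Composite one premultiplied foreground pixel over the background."""
    return tuple(f + b * (1.0 - fg_a) for f, b in zip(fg_rgb, bg_rgb))

# Where the foreground is fully transparent, the background shows through:
assert over((0.0, 0.0, 0.0), 0.0, (1.0, 1.0, 1.0)) == (1.0, 1.0, 1.0)
```

Without a working alpha channel coming out of the engine, `fg_a` is effectively lost and the interactive layer can only replace the pre-rendered content wholesale, which is why the backported fixes mattered.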