REALTIME

Explore a gallery of real-time 3D environments, meticulously crafted to leverage the full potential of the GPU. The collection features immersive experiences in VR and MR, along with interactive Unity 3D presentations designed for touchscreens and beyond.

Every real-time 3D scene showcased on this page is the result of a bespoke export pipeline that I have refined project by project to ensure the highest quality and performance in 3D presentations.

All of the real-time 3D content showcased here was created in Unity 3D. It uses a custom PBR/Full Pass Reconstruction shader that allows lighting solutions and material properties to be adjusted dynamically at runtime. This is particularly useful for demonstrating material changes, enhancing the interactive and realistic nature of the scenes.
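The shader source itself isn't shown here, but the kind of runtime switching it enables can be sketched with a small Unity component. Everything in this sketch (the `FinishSwitcher` class and property names like `_BakedLightmap` and `_FinishTint`) is hypothetical; the actual shader and property layout differ.

```csharp
using UnityEngine;

// Hypothetical sketch: swaps baked-lighting textures and material finishes
// at runtime without duplicating materials, using a MaterialPropertyBlock.
public class FinishSwitcher : MonoBehaviour
{
    public Texture dayLightmap;   // baked offline, assigned in the inspector
    public Texture nightLightmap;
    public Color finishTintA = Color.white;
    public Color finishTintB = Color.gray;

    private Renderer _renderer;
    private MaterialPropertyBlock _block;

    void Awake()
    {
        _renderer = GetComponent<Renderer>();
        _block = new MaterialPropertyBlock();
    }

    // Called from UI (or a tablet command) to flip time-of-day and finish.
    public void Apply(bool night, bool finishB)
    {
        _renderer.GetPropertyBlock(_block);
        _block.SetTexture("_BakedLightmap", night ? nightLightmap : dayLightmap);
        _block.SetColor("_FinishTint", finishB ? finishTintB : finishTintA);
        _renderer.SetPropertyBlock(_block);
    }
}
```

Using a `MaterialPropertyBlock` rather than instantiating materials keeps draw-call batching intact, which matters when the same switch is applied across an entire scene.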

To enable rapid import of 3D and material data, I developed a series of custom editor tools in Unity. These tools consume data exported as custom JSON from 3DS Max via MAXScript I wrote. You can read more about the real-time 3D pipeline tools here.
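A Unity editor tool of this kind can be sketched as follows. The JSON schema shown (`MaxSceneData`, `MaxMaterialData`) is an invented stand-in for illustration only; the real export format is more detailed.

```csharp
using System.IO;
using UnityEditor;
using UnityEngine;

// Hypothetical shape of the 3DS Max export; the real schema differs.
[System.Serializable]
public class MaxMaterialData
{
    public string name;
    public string diffuseTexture;
    public float glossiness;
}

[System.Serializable]
public class MaxSceneData
{
    public MaxMaterialData[] materials;
}

public static class MaxJsonImporter
{
    // Menu item so the importer lives inside the Unity editor, as described.
    [MenuItem("Tools/Import Max Scene JSON")]
    public static void Import()
    {
        string path = EditorUtility.OpenFilePanel("Max scene JSON", "", "json");
        if (string.IsNullOrEmpty(path)) return;

        var scene = JsonUtility.FromJson<MaxSceneData>(File.ReadAllText(path));
        foreach (var mat in scene.materials)
            Debug.Log($"Would create material '{mat.name}' (gloss {mat.glossiness})");
    }
}
```

In a real pipeline the loop body would create `Material` assets and wire up textures instead of logging, but the editor-menu plus JSON-deserialization structure is the core of the approach.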

567 CLARKE PENTHOUSE

The 567 Clarke Penthouse VR experience, developed as an innovative sales tool for Marcon, revolutionized the presentation of their luxury penthouse units. This immersive, room-scale VR journey enabled prospective buyers to navigate the space in real time. It also let buyers view the unit under lighting conditions simulating different times of day and compare two distinct material finishes. This cutting-edge approach not only enriched the customer experience but also played a significant role in boosting sales.

The video featured here was captured on an HTC Vive with Leap Motion hand tracking, which anchors users more firmly within the scene. It shows an interactive element where a light switch is pressed, altering the scene's lighting, and a voice command used for scene adjustments. Replacing traditional VR controllers with hand tracking made the interaction more intuitive, especially for non-gamers, significantly enhancing the experience without overwhelming users. This approach proved highly successful in making the experience more accessible and engaging.

Alongside the pipeline tools mentioned above, I also created the lighting solution and materials for this scene. All of the lighting you see comes from maps baked in VRay and brought into Unity using the custom shaders.

The tablet controller and interactivity were created in collaboration with Designstor.

TABLET INTERACTION

The virtual tour can be seamlessly guided using an iPad, giving sales agents the capability to navigate users through the experience. With this control, agents possess the same functionalities as the user, such as switching between different spatial configurations, material choices, and times of day. Additionally, they can position the user at specific locations within the penthouse, offering a tailored viewing experience.
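The page doesn't describe the transport between iPad and headset, but the agent-side control can be sketched as small commands applied on the headset. The `TourCommand` schema and `OnMessage` entry point below are assumptions for illustration, not the actual protocol.

```csharp
using UnityEngine;

// Hypothetical command sent from the sales agent's iPad to the headset.
// Assumes small JSON payloads arriving over a network connection.
[System.Serializable]
public class TourCommand
{
    public string timeOfDay;   // e.g. "day" or "dusk"
    public string finish;      // which material scheme to display
    public Vector3 teleportTo; // position to place the viewer at
    public bool teleport;      // whether teleportTo should be applied
}

public class TourCommandReceiver : MonoBehaviour
{
    public Transform viewerRig; // the VR camera rig to reposition

    // Called whenever a payload arrives from the tablet.
    public void OnMessage(string json)
    {
        var cmd = JsonUtility.FromJson<TourCommand>(json);
        if (cmd.teleport)
            viewerRig.position = cmd.teleportTo;
        // Time-of-day and finish changes would be forwarded to the
        // scene's lighting and material controllers from here.
    }
}
```

Because the agent and viewer share the same command vocabulary, the tablet simply has parity with the in-headset controls, which is what lets agents drive the tour without a separate feature set.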

GALLERY

SERIF

SERIF, a mid-rise development situated in the heart of downtown San Francisco, boasts two touchscreen displays that present a real-time 3D model of the building. These interactive displays offer a detailed exploration of units, floor plans, views, amenities, and other features, enhancing the experience for visitors and potential buyers.

The animation showcased is a real-time 3D capture from Unity running on a Windows desktop. The lighting is sourced from lightmaps created in 3DS Max/VRay and processed through my real-time 3D pipeline, which translates all of the material and geometric data for use in Unity. The trees in the scene are generated with SpeedTree, while the bushes come from a custom solution that converts iToo Forest scatters into a format parsed in Unity.
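Once scatter transforms have been parsed on the Unity side, rendering them efficiently is largely a matter of GPU instancing. The sketch below is a minimal, assumed version of that final step (the `ScatterRenderer` class and data layout are illustrative, not the actual converter):

```csharp
using UnityEngine;

// Hypothetical sketch: draw parsed scatter transforms (e.g. bushes exported
// from an iToo Forest scatter) with GPU instancing in Unity.
public class ScatterRenderer : MonoBehaviour
{
    public Mesh bushMesh;
    public Material bushMaterial;   // must have "Enable GPU Instancing" on
    public Vector3[] positions;     // filled by the import pipeline
    public float[] uniformScales;

    private Matrix4x4[] _matrices;

    void Start()
    {
        _matrices = new Matrix4x4[positions.Length];
        for (int i = 0; i < positions.Length; i++)
            _matrices[i] = Matrix4x4.TRS(
                positions[i],
                Quaternion.Euler(0f, Random.Range(0f, 360f), 0f), // vary rotation
                Vector3.one * uniformScales[i]);
    }

    void Update()
    {
        // DrawMeshInstanced accepts up to 1023 matrices per call;
        // larger scatters would be split into batches.
        Graphics.DrawMeshInstanced(bushMesh, 0, bushMaterial, _matrices);
    }
}
```

Instancing keeps thousands of bushes to a handful of draw calls, which is what makes dense scatters viable on a touchscreen kiosk.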

The images show the final real-time 3D building in the presentation, highlighting the advanced filtering and interactivity developed by CreativeDirection.io and demonstrating the integration of design and technology in the final product.

MICROSOFT HOLOLENS

I have extensive experience working with the Microsoft HoloLens, a leading Mixed Reality (MR) device. My expertise extends beyond programming for the HoloLens to achieving stable hologram performance through rigorous optimization of both the real-time 3D content and the underlying code, ensuring a seamless and immersive MR experience.

The video showcases shared tracking experiences, developed in collaboration with Designstor. It features a unique capability where multiple users in the same room can simultaneously view and interact with the same model in a virtual space. This innovative technology enables participants to physically point and interact with virtual objects in 3D space, creating an incredibly immersive and collaborative experience.

Development for the HoloLens was carried out in C#/Unity, using a custom real-time 3D baking pipeline. This pipeline was instrumental in allowing detailed models with baked shadows to run efficiently on the device, maintaining both performance and visual fidelity.

CAMH

The CAMH Queen West campus leveraged a VR experience to facilitate the design approval process for a new facility. Key areas were modeled and explored using an HTC Vive headset, allowing stakeholders to evaluate the layout for efficiency, visibility, and patient safety. This immersive approach provided immediate, experiential feedback to designers, enabling necessary modifications.

Weekly updates to the VR experience were a staple, with stakeholders participating in hour-long sessions to critique and suggest alterations to the design.

To make these design sessions more efficient, an iPad controller was integrated into the VR setup, bypassing the need for extensive controller training that had previously taken up valuable time. This allowed architects to effortlessly guide the VR experience from a bird's-eye view, navigating across various discussion points with ease.

Further improving the VR experience, physical chairs were equipped with trackers to replicate their position in the virtual environment, giving users a realistic sense of space from behind a care station.
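Mirroring a tracked chair into the scene can be sketched in a few lines. This assumes the tracking plugin (e.g. SteamVR) already drives a `Transform` for the tracker puck; the `TrackedChair` component and mount offset here are illustrative.

```csharp
using UnityEngine;

// Hypothetical sketch: mirror a physical chair into the virtual scene.
// trackerTransform is assumed to be driven by a Vive tracker via the
// tracking plugin; mountOffset accounts for where the puck is mounted
// relative to the seat.
public class TrackedChair : MonoBehaviour
{
    public Transform trackerTransform; // driven by the tracking system
    public Vector3 mountOffset;        // seat position relative to the puck

    void LateUpdate()
    {
        // Follow the physical chair each frame, after tracking updates.
        transform.position = trackerTransform.TransformPoint(mountOffset);
        transform.rotation = trackerTransform.rotation;
    }
}
```

Updating in `LateUpdate` ensures the virtual chair follows the same frame's tracking pose rather than lagging one frame behind.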

I developed the pipeline for this virtual experience, focusing on speed and efficiency during model reviews. Models were imported directly into Unity, where a suite of scripts applied a grid material for scale reference and Screen Space Ambient Occlusion (SSAO) to render basic shadows. Points of significance, such as patient areas and the tracked chair, were accentuated in blue for easy identification.
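The review-prep pass described above can be sketched as a single component run over an imported model. The tag name and material references are assumptions; SSAO itself is a camera post-effect and is enabled separately.

```csharp
using UnityEngine;

// Hypothetical sketch of the review-prep pass: after a model is imported,
// apply a grid material everywhere for scale reference, then give objects
// tagged as points of interest (patient areas, the tracked chair) a flat
// blue material instead. Tag and material names are illustrative.
public class ReviewScenePrep : MonoBehaviour
{
    public Material gridMaterial;       // tiling grid for scale reference
    public Material highlightMaterial;  // flat blue for points of interest

    void Start()
    {
        foreach (var r in GetComponentsInChildren<Renderer>())
            r.sharedMaterial = r.CompareTag("PointOfInterest")
                ? highlightMaterial
                : gridMaterial;
    }
}
```

Assigning `sharedMaterial` (rather than `material`) avoids creating per-renderer material instances, which keeps weekly re-imports fast and memory-light.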

This project underscored the practical benefits of VR technology more than any other I've been involved with. It was gratifying to witness medical professionals engaging with the virtual space, concentrating on crucial aspects like visibility and accessibility, rather than on the mechanics of controller usage or the next steps in the process.