Automated Photogrammetry Pipeline
"Houdini really can do anything!"
-Debra Isaac, SideFX Mentor
In an ongoing research project to create a better photogrammetry pipeline for games, SideFX Houdini has proven itself my personal "one-stop shop" for quick, automated content creation.
This node graph contains the basics of the pipeline setup. The top of the system takes in the photos and their naming convention, then processes and builds geometry from that input.
From there, additional steps clean up the raw results before they make their way into the game development portion of the pipeline.
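The ingest step described above can be sketched in a few lines. This is a minimal, hypothetical example — the asset-name-plus-frame-number convention (e.g. "rock_0001.jpg") and the `group_photos` helper are assumptions for illustration, not the pipeline's actual naming scheme:

```python
import re
from collections import defaultdict

# Hypothetical naming convention: "<asset>_<frame>.<ext>", e.g. "rock_0001.jpg".
PHOTO_PATTERN = re.compile(r"^(?P<asset>[A-Za-z0-9]+)_(?P<frame>\d{4})\.(jpg|png|tif)$")

def group_photos(filenames):
    """Group photo filenames by asset name so each set can be fed to the
    photogrammetry solve as one reconstruction batch."""
    batches = defaultdict(list)
    for name in filenames:
        match = PHOTO_PATTERN.match(name)
        if match:  # ignore files that don't follow the convention
            batches[match.group("asset")].append(name)
    return {asset: sorted(files) for asset, files in batches.items()}
```

Given a folder containing "rock_0001.jpg", "rock_0002.jpg", and "log_0001.png", this yields one batch per asset, ready to hand off to the geometry-building stage.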
Using the same logic as digital sculpting tools, this pipeline bakes the high level of detail onto a much smaller, more manageable mesh. Once baked, these assets are ready for any game engine but can also be customized further.
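The idea behind the bake can be shown with a toy sketch: for each point on the low-res mesh, copy an attribute (here, a normal) from the nearest high-res point. This is a deliberately simplified stand-in for a real projection bake — `bake_nearest` and its brute-force nearest-point search are illustrative assumptions, not the pipeline's actual implementation:

```python
import math

def bake_nearest(low_points, high_points, high_normals):
    """Toy detail transfer: for each point on the low-res mesh, copy the
    normal of the nearest high-res point, mimicking how a bake projects
    high-poly surface detail onto a simpler target mesh."""
    baked = []
    for lp in low_points:
        nearest = min(range(len(high_points)),
                      key=lambda i: math.dist(lp, high_points[i]))
        baked.append(high_normals[nearest])
    return baked
```

A production bake would ray-cast along the low-poly surface normals and write the result into a texture, but the core idea — dense source detail sampled onto a sparse target — is the same.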
For pipeline examples, click here.
Prototype Demonstration - Wireless Configuration
Unreal Engine and OptiTrack:
Virtual Camera System + Live Composite
First Test Recording - Realtime Composite + Motion Tracking
"Wouldn't it be cool if..."
-Nick J, Drexel Professor
What started out as an independent study in VFX Compositing very quickly turned into a James Cameron-esque dive into realtime content creation.
Inspired by the recent release of Disney's The Lion King, fellow Drexel undergrad Jenn Raimondi and I initially sought to create a virtual camera for Unreal with the assistance of an OptiTrack motion capture system. The result quickly grew into a homebrewed Simulcam, similar to those used in modern filmmaking.
In the final weeks of the study there were three main components that made up the system:
1. Unreal Green Screen Widget
(made by Jenn)
2. Motion-tracked camera rig with a Sony FS100 and Blackmagic monitor.
3. HP Z wireless backpack computer, courtesy of Intel and Drexel University.
Although this is a gross simplification of the setup, the results were surprisingly effective. By attaching the HP Z backpack to the shoulder-mounted camera rig, we were able to create a working prototype that ran completely wirelessly into Unreal Engine.
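Under the hood, the tracked camera ultimately reaches the engine as a stream of pose samples. A minimal sketch of decoding one such sample is below — the packet layout here is a hypothetical simplification for illustration, not the actual NatNet wire format, and `decode_pose` is an assumed helper name:

```python
import struct

# Hypothetical packet layout (NOT the real NatNet format): one rigid body
# encoded as 7 little-endian floats — position xyz, then quaternion xyzw.
POSE_FORMAT = "<7f"

def decode_pose(packet: bytes):
    """Unpack a tracked-camera pose sample into position and rotation."""
    x, y, z, qx, qy, qz, qw = struct.unpack(POSE_FORMAT, packet)
    return {"position": (x, y, z), "rotation": (qx, qy, qz, qw)}
```

In the real system this role is filled by OptiTrack's streaming SDK feeding Unreal's camera transform each frame; the sketch only shows the shape of the data making the trip.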
The system was later adapted with higher-resolution cameras and HTC Vive tracking, and is currently being outfitted for use with Vicon Shogun.
Immersive Research Lab (IRL)
"Tell me what can be done, and let's make it brilliant."
-Michael Wagner, DIGM Dept. Head
Back in the Summer of 2018 I got the rare opportunity to build a research lab from the ground up, and dedicate it to furthering immersive media.
Before IRL, there was a computer lab known by students as "The Fish Bowl." The lab and its computers were slated to be repurposed, and because of my role in Drexel's Extended Reality Club (DUXR), I was asked what students might want to see in the new space. I knew immediately that it held great potential for growth.
With a substantial amount of help from university funds and a few cases of energy drinks, I was able to build the truss system and configure the lights, ambisonic sound system, and two motion capture systems that are in use today.
This new lab also helped spark interest in Drexel University's recently added Virtual Reality and Immersive Media major. Today, a small group of students, myself included, regularly makes use of the research space for XR-related work.
For an in-depth tour of the space and its uses, click here!