Supported by the Office of the Vice President for Research, and Temple Libraries, the Loretta C. Duckworth Scholars Studio will host VR@TU: “Real Research in Virtual Reality” to showcase VR-related work at Temple and to begin a conversation about how we envision the future of VR at Temple. The event is moderated by Marcus Bingenheimer, Olivia Given Castello, and Alex Wermer-Colan.
After a keynote presentation by Kent Bye, panels, presentations, and lightning talks will show how VR is used for research and teaching.
To watch a recording of this event, visit the Temple Library webpage.
Welcome (Marcus Bingenheimer)
Kent Bye “Ultimate Potential of VR: Challenges & Opportunities for Higher Education”
10:10-10:30: Q & A with Kent Bye
10:35-11:00: Panel 1
Benjamin Seibold “Re-Shaping High-Performance Computing via Interactive Virtual Reality” (10 min)
W. Geoffrey Wright “What’s ‘up’ with VR? How to stay balanced while immersed” (10 min)
11:05-11:30: Panel 2
Gabrielle Gutierrez “Reality or not – Here’s VR: Flexibility and presence in the classroom” (10 min)
Christopher McAdams “Exploring the Unbuilt – A Past and Future of Architecture” (10 min)
11:35-12:00: Panel 3
Peter d’Agostino “VRs / XRs…in the Arts & Humanities” (10 min)
Hector Postigo “Disintermediating Immersion: Infrared mapping vs. controller interface and proprioception in VR Environments” (10 min)
Peter d’Agostino (TFMA) is Professor of Film & Media Arts, and Director of Climate Sustainability and the Arts, Temple University. His pioneering photography, video and interactive new media projects have been included in the international biennial exhibitions of São Paulo, Brazil; Gwangju, South Korea; and the Whitney Museum of American Art, New York.
Kent Bye has conducted over 1500 Voices of VR podcast interviews since May 2014, featuring the pioneering artists, storytellers, and technologists driving the resurgence of virtual & augmented reality. He’s a philosopher, oral historian, & experiential journalist helping to define the patterns of immersive storytelling, experiential design, ethical frameworks, & the ultimate potential of XR. You can follow his work on Twitter @kentbye and the Voices of VR podcast.
Gabrielle Gutierrez (FOX) is a Video Production & Editing Assistant in the Online & Digital Learning department at the Fox School of Business. As a video editing specialist, she develops video content for asynchronous learning within Fox. Her background as a VFX artist in 3D animation, game development, and video editing allowed her to transition into developing a pilot VR course offered in Fox’s online graduate program. Drawing on her experience in the VR entertainment industry, her technical direction has produced immersive virtual environments for students.
Christopher McAdams (Tyler) is Assistant Professor of Instruction and an architect working with augmented and virtual reality platforms to investigate immersive and interactive spaces. He uses parametric tooling, digital fabrication, and augmented and virtual reality to generate new opportunities in the built realm and to explore unbuilt work from history.
Benjamin Seibold (CST) is Associate Professor of Mathematics and the Director of the Center for Computational Mathematics and Modeling. He works on the modeling and simulation of real-world processes (traffic flow, invasive species, radiation transport) as well as the development of novel, efficient computational methods.
Hector Postigo (Klein) is Associate Professor at the Klein College of Communication and Media Studies. He holds degrees in Physiology and Neurobiology and Science & Technology Studies, and his research has been part of fellowships at Stanford CASBS, Yale’s ISP and Rutgers School of Law and projects for the NSF and EU. In the field of VR studies, Postigo studies the comparative impact of infrared motion-capture interfaces on proprioception.
W. Geoffrey Wright (CPH) is Associate Professor and Director of the Neuromotor Science Program in the CPH. He has degrees in Neuroscience, Engineering, and Experimental Psychology, and teaches in the Physical Therapy program. This background has shaped his clinically-focused projects using virtual reality over the last two decades.
Keynote: Kent Bye
Title: Ultimate Potential of VR: Challenges & Opportunities for Higher Education
Abstract: Since May 2014, Kent Bye has conducted over 1500 Voices of VR podcast interviews featuring the pioneering artists, storytellers, and technologists driving the resurgence of virtual & augmented reality. He’s a philosopher, oral historian, & experiential journalist who has been helping to define the patterns of immersive storytelling, experiential design, and ethical frameworks for XR. He’s been asking pioneers about the ultimate potential of XR for the last six years, and he’ll be presenting an overview of the landscape of XR, an experiential design framework to help describe the unique affordances of XR, and some of the latest trends in immersive education. He’ll be exploring some of the challenges and opportunities for higher education including assessment, embodied learning, interdisciplinary collaboration, and accessibility. You can follow his work through the Voices of VR Podcast or on Twitter @kentbye.
Peter d’Agostino “VRs / XRs…in the Arts & Humanities”
Abstract: Virtual realities (VRs) and extended realities (XRs) in a myriad of forms have always been created within the realm of the arts. Like handprints on the cave walls of our prehistoric imagination, marks, images, and sounds are all potentially interactive experiences – moving from the past to the present, back and forth in time, an ever present. In the Western tradition, advances in perspective and the development of the camera obscura during the Renaissance eventually led to chemical-based analog photography and film technologies in the 19th century, while electronic and digital forms evolved during much of the 20th century. Now in 2020, we are facing an extraordinary confluence of events: the Covid-19 pandemic, Black Lives Matter protests for racial justice, raging fires and devastating floods, all justifiably at the forefront of our consciousness. Earlier this year, which marks the 75th anniversary of the birth of the Atomic Age, the Board of the Bulletin of the Atomic Scientists proclaimed that we are living in history’s most dangerous era as “Humanity continues to face two simultaneous existential dangers – nuclear war and climate change.” Projects focusing on these themes from a current exhibition, “Peter d’Agostino: A-Bombs / Climate walks,” will be included in this presentation. They include excerpts from “VR / RV: a Recreational Vehicle in Virtual Reality” (1994/2020), “TRACES: virtual installation” (1995/2020), and “World-Wide-Walks / between earth & water / DESERTS” (2014).
Gabrielle Gutierrez “Reality or not – Here’s VR: Flexibility and presence in the classroom”
Utilizing VR technology as a tool for online education, the Fox School of Business’ Online and Digital Learning Department is developing a VR app that will allow students to engage with their professor in an immersive VR environment. This web-conferencing application promotes discussion and interaction between VR students and presenters by replicating a classroom environment, allowing students in remote locations to participate in class discussions as they would on campus. A live video stream from presenters creates a source of information accessible from any location. For the student who prefers being in class, this application combines the flexibility of online learning with live classroom interaction in a familiar educational environment. Our presentation covers the steps that contributed to the development of the app and the steps to consider when preparing for a new virtual classroom.
Chris McAdams: “Exploring the Unbuilt – A Past and Future of Architecture”
Traditional methods of architectural representation often leave viewers with a limited understanding of how spaces feel and function. Using new tools to visit unbuilt spaces and explore new possibilities for architecture and design, visionary projects of the past are brought to life and explored in ways previously unimaginable. These tools are also equally forward-looking and provide new grounds for immersive, integrated, and collaborative paths in the design process.
Hector Postigo “Disintermediating Immersion: Infrared mapping vs. controller interface and proprioception in VR Environments”
While testing gameplay experiences in VR, we observed changes in proprioception behaviors post-immersion. That observation is consistent with existing video game and clinical applications. What distinguishes our system is that the systems we reviewed use controller-based interfaces (CBI), whereas our implementation uses an infrared-based interface (IRI) only, in which an IR sensor attached to the head-mounted apparatus maps the extremities into the VR environment. We propose that sustained immersion in VR environments using an IRI can affect motor control more effectively, opening the door to expanded design paradigms for therapeutic VR interventions and immersive games.
Benjamin Seibold: “Re-Shaping High-Performance Computing via Interactive Virtual Reality”
Abstract: The traditional paradigm of high-performance computing (HPC) is to submit a computational task to a cluster; and once the job has run, use a local machine to visualize the simulation results. Then, devise a revised simulation and re-submit. Faculty and students in the Center for Computational Mathematics and Modeling explore how interactive virtual reality (VR) can re-shape HPC: while running a prototype simulation, results are visualized via VR in real time; and moreover, the user can interactively affect the computation while it is running. While this framework is ubiquitous in gaming, it is largely undiscovered territory in first-principles HPC. We showcase this ongoing research effort via an interactive simulation of partial differential equations on complex neuroscience geometries.
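The interactive paradigm the abstract contrasts with batch HPC (visualize every frame while the solver runs, and let user input perturb the computation mid-run) can be illustrated with a minimal sketch. This is not the Center's actual code; all names, the 1-D heat equation, and the callback design are hypothetical stand-ins for a VR headset stream and controller input:

```python
import numpy as np

def step_heat(u, alpha=0.1):
    # One explicit finite-difference step of the 1-D heat equation
    # with fixed boundary values (alpha <= 0.5 keeps the scheme stable).
    un = u.copy()
    un[1:-1] = u[1:-1] + alpha * (u[2:] - 2 * u[1:-1] + u[:-2])
    return un

def run_interactive(u0, n_steps, on_frame=None, on_input=None):
    """Run the solver, handing every frame to a visualizer and accepting
    user input mid-run, instead of the batch submit-then-inspect cycle."""
    u = u0.copy()
    for k in range(n_steps):
        u = step_heat(u)
        if on_input is not None:
            u = on_input(k, u)   # e.g. a heat source moved by a VR controller
        if on_frame is not None:
            on_frame(k, u)       # e.g. stream the field to a VR headset
    return u

# Hypothetical "user input": inject heat at the domain midpoint on step 50.
def poke(k, u):
    if k == 50:
        u[50] += 1.0
    return u

frames = []
result = run_interactive(np.zeros(101), 200,
                         on_frame=lambda k, u: frames.append(u.max()),
                         on_input=poke)
```

In a real VR coupling, `on_frame` would push the state to the rendering pipeline and `on_input` would poll the headset's controllers; the point of the sketch is only that both callbacks live inside the time-stepping loop rather than after the job finishes.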
W. Geoffrey Wright: “What’s ‘up’ with VR? How to stay balanced while immersed”
Standing up on two feet is a skill mastered by most babies within 1 year of life. In fact, bipedal upright stance is a highly complex skill that very few creatures on the planet can do well, and even humans only perform effectively if their central and peripheral nervous systems are intact and haven’t been compromised by overindulgence at a keg party. Staying balanced requires multisensory integration, meaning visual, vestibular (balance organs in the inner ear), and somatosensory (touch and kinesthesia) information needs to be combined and processed in the brain and used to guide stabilizing movements of the body to keep from falling. This is where virtual reality (VR) comes in. VR is a unique tool for testing balance, because it can easily be used to manipulate visual input such that it has a strong effect on visual-vestibular-somatosensory processing. Anyone who has ever been in a Star Wars movie, has probably experienced how real it can feel when just watching from the pilot’s seat of the Millennium Falcon as it flies through space during a fight with Empire. This amazingly immersive sense of self-motion, formally called “vection”, can affect one’s sense of up and down, one’s motor reactions, and one’s balance (even if sitting). In this talk, I will present examples from over the last century of research demonstrating how VR (and its predecessors) can change one’s sense of spatial orientation and self-motion and how this can be used to measure one’s neural health after brain injury or disease. While the video gaming industry has made it affordable to put VR in almost any household with a teenager, the applications for healthcare are vast and are growing as more and more VR research is done.