Temple’s VR Day
Where: Eventspace 1st Floor Charles Library, VR Lab 3rd Floor Charles Library
When: Postponed until October 21st, 9:30–16:00
Supported by the Office of the Vice President for Research, we will host the first VR Day at TU to showcase VR-related work being done at Temple and to begin a conversation about how we envision the future of VR at Temple.
After a keynote presentation by Kent Bye, panels, presentations, and lightning talks will show how VR is used for research and teaching. VR demos will be offered throughout the day in the VR Lab.
Kent Bye (Voices of VR): “Ultimate Potential of VR: Challenges & Opportunities for Higher Education”
Since May 2014, Kent Bye has conducted over 1500 Voices of VR podcast interviews featuring the pioneering artists, storytellers, and technologists driving the resurgence of virtual & augmented reality. He’s a philosopher, oral historian, & experiential journalist who has been helping to define the patterns of immersive storytelling, experiential design, and ethical frameworks for XR. He’s been asking pioneers about the ultimate potential of XR for the last six years, and he’ll be presenting an overview of the landscape of XR, an experiential design framework to help describe the unique affordances of XR, and some of the latest trends in immersive education. He’ll be exploring some of the challenges and opportunities for higher education including assessment, embodied learning, interdisciplinary collaboration, and accessibility. You can follow his work through the Voices of VR Podcast or on Twitter @kentbye.
Lectures & Lightning Talks:
Peter d’Agostino (Theater, Film & Media Arts): “VRs / XRs: arts & technologies, 1970s – 2020”
This presentation is a personal look at new media arts projects employing forms of human-machine interactivity that emerged in the 1970s and continue to the present. Themes of natural, cultural, virtual, and extended realities are explored within a framework of physically embodied awareness, especially notable now as environmental boundaries are undergoing radical climatic change. Two projects will serve as focal points for these concerns: VR/RV: a Recreational Vehicle in Virtual Reality (1993) and World-Wide-Walks / between earth & water / DESERTS (2016).
Olivia Given Castello (TU Libraries): “‘Experiencing what a future client may feel’: Using Empathy-based VR to Enhance Social Work Education”
Virtual reality has the potential to give students in helping professions experience in the virtual world before becoming practitioners working with clients in real life. Empathy-based interactive experiences go beyond typical VR simulations and, when supported by a structured curriculum, can help deepen student understanding of a topic. This lightning talk will discuss a social work student group’s recent activity with the Becoming Homeless VR experience at Temple Libraries, share student and professor feedback on the lesson, and make additional suggestions for using empathy-based VR to enhance student learning at Temple.
Jasmine Clark (TU Libraries): “Accessibility Approaches for Virtual Reality”
This presentation will share resources on designing accessible (for disability) virtual reality (VR) experiences. It will also share information regarding ways programs and courses utilizing VR can be more inclusive of disabled participants.
Lei Ma (Public Health): “Virtual reality therapy as adjunct to traditional physical therapy for a TBI patient who suffered a gunshot wound to the head: Case report”
Traumatic brain injury by gunshot creates a variety of unique sequelae that make it very challenging for clinicians to develop rehabilitation interventions. This case report presents an example of supplementing traditional physical therapy with virtual reality training for a patient who suffered a penetrating traumatic brain injury to the back of the head. By personalizing and modulating the virtual scenes to match the patient’s deficits and tolerance for virtual reality exposure, the patient was able to progress in his rehabilitation, which had plateaued after traditional therapy alone. At the conclusion of his rehabilitation, the patient showed clinically meaningful improvements in functional mobility assessments and subjective self-reports.
Andy Maggio (Glimpse Group): “D6 VR: The Future of Data Analysis and Presentation”
VR opens a new frontier in how humans can interact with and process information. D6 VR is harnessing immersive technologies (VR/AR) and AI to allow users to extract better insights faster from complex datasets, and then to present those insights in a clearer and more memorable fashion.
Chris McAdams (Tyler): “Exploring the Unbuilt – A Past and Future of Architecture”
Bora Ozkan and Gabrielle Gutierrez (Fox): “Reality or not – Here’s VR: Flexibility and presence in the classroom”
Utilizing VR technology as a tool for online education, the Fox School of Business’ Online and Digital Learning Department is developing a VR app that will allow students to engage with their professor in an immersive VR environment. This web conferencing application promotes discussion and interaction between VR students and presenters by replicating a classroom environment, allowing students in remote locations to participate in class discussions as they would on campus. A live video stream of the presenter will create a source of information accessible from any location. For students who prefer being in class, this application combines the flexibility of online learning with live classroom interaction in a familiar educational environment. Our presentation covers the steps that contributed to the development of the app and steps to consider when preparing for a new virtual classroom.
Hector Postigo (Klein): “Disintermediating Immersion: Infrared mapping vs. controller interface and proprioception in VR Environments”
While testing gameplay experiences in VR, we observed changes in proprioception behaviors post-immersion. That observation is consistent with existing video game and clinical applications. What distinguishes our system from existing work is that the systems we reviewed use controller-based interfaces (CBI), whereas our implementation uses an infrared-based interface (IRI) only, in which an IR sensor attached to the head-mounted apparatus maps the extremities into the VR environment. We propose that sustained immersion in VR environments using an IRI can impact motor control more effectively, opening the door to expanded design paradigms for therapeutic VR interventions and immersive games.
Benjamin Seibold (Science and Technology): “Re-Shaping High-Performance Computing via Interactive Virtual Reality”
The traditional paradigm of high-performance computing (HPC) is to submit a computational task to a cluster; once the job has run, visualize the simulation results on a local machine; then devise a revised simulation and re-submit. Faculty and students in the Center for Computational Mathematics and Modeling explore how interactive virtual reality (VR) can re-shape HPC: while a prototype simulation is running, results are visualized via VR in real time, and moreover, the user can interactively affect the computation as it runs.
While this framework is ubiquitous in gaming, it is largely undiscovered territory in first-principles HPC. We showcase this ongoing research effort via an interactive simulation of partial differential equations on complex neuroscience geometries.
Anula Shetty (Klein): “Places of Power – Immersive VR Documentary”
Places of Power is an immersive VR documentary about the Fairhill neighborhood that invites viewers to experience North Philadelphia anew as a tapestry of knowledge, healing, and civic leadership. The neighborhood has suffered decades of depopulation and disinvestment. In recent years, every single high school has been closed. Eighty-six percent of the area’s households have incomes below the poverty line. These realities amount to tremendous socio-economic challenges for the community. Created in collaboration with community residents, the project includes profiles of four remarkable North Philly change agents. Two of the community leaders, Nandi and Khalid Muhammed, have run a penny candy store out of their living room for over 15 years, and use candy as a vehicle to teach children counting, Black history, art, and communication skills. They’re remarkable people who foster childhood magic and curiosity on an ordinary North Philly block. Mapping these nontraditional community access points is a key step toward threading together the people and spaces that hold power in the neighborhood but are often invisible. Through the project we celebrate and connect these places, thereby multiplying their public safekeeping power.
Alex Wermer-Colan (TU Libraries): “Immersive Pedagogy: Developing a Decolonial and Collaborative Framework for Teaching and Learning in 3D/VR/AR”
In June 2019, a cohort of CLIR postdoctoral fellows convened Immersive Pedagogy: A Symposium on Teaching and Learning with 3D, Augmented and Virtual Reality at Carnegie Mellon University. The symposium sought to bring together a multidisciplinary group of collaborators to think through pedagogical issues related to using 3D/VR/AR technologies, as well as to produce and disseminate materials for teaching and learning. This talk presents the Immersive Pedagogy symposium as a model for interrogating and developing pedagogical practices and standards for 3D/VR/AR; we offer a decolonial, anti-ableist, and feminist pedagogical framework for collaboratively developing and curating humanities content for this emerging technology by summarizing the symposium’s keynotes, workshops, as well as its goals and outcomes. Workshops, keynotes, and participant conversations engaged with decolonial and feminist methodologies, practiced accessible design for universal learning, offered templates for humanistic teaching, and illustrated the possibilities of using 3D/VR/AR to extend critical thinking.
William Wright (Public Health): “What’s ‘up’ with VR? How to stay balanced while immersed”
Standing up on two feet is a skill mastered by most babies within the first year of life. In fact, bipedal upright stance is a highly complex skill that very few creatures on the planet can do well, and even humans only perform it effectively if their central and peripheral nervous systems are intact and haven’t been compromised by overindulgence at a keg party. Staying balanced requires multisensory integration, meaning visual, vestibular (balance organs in the inner ear), and somatosensory (touch and kinesthesia) information must be combined and processed in the brain and used to guide stabilizing movements of the body to keep from falling. This is where virtual reality (VR) comes in. VR is a unique tool for testing balance, because it can easily be used to manipulate visual input such that it has a strong effect on visual-vestibular-somatosensory processing. Anyone who has ever watched a Star Wars movie has probably experienced how real it can feel to ride along from the pilot’s seat of the Millennium Falcon as it flies through space during a fight with the Empire. This amazingly immersive sense of self-motion, formally called “vection”, can affect one’s sense of up and down, one’s motor reactions, and one’s balance (even while sitting). In this talk, I will present examples from over a century of research demonstrating how VR (and its predecessors) can change one’s sense of spatial orientation and self-motion, and how this can be used to measure neural health after brain injury or disease. While the video gaming industry has made it affordable to put VR in almost any household with a teenager, the applications for healthcare are vast and growing as more and more VR research is done.