At the intersection of virtual reality (VR), augmented reality (AR), and mixed reality (MR) research, the Multisensory Brain and Cognition Lab is committed to understanding how humans perceive, think, and interact in both physical and digital environments. Our team of researchers aims to contribute to the scientific knowledge base that will inform the development of the metaverse while considering its potential impact on human health and well-being.
Our research focuses on human perception, cognition, and action, utilizing advanced VR, AR, and MR technologies to study the intricate relationships between the mind, body, and the surrounding world. Through rigorous experimentation and data analysis, we aim to provide a solid foundation for the design and implementation of the metaverse.
Immersive technology is a rapidly evolving field that combines display, tracking, and interaction technologies to create engaging and interactive experiences for users. It blurs the boundary between the physical and virtual worlds, enabling a sense of immersion in different environments.
VR is a technology that uses immersive, highly visual, 3D simulations to replicate real-life situations or healthcare procedures. It is distinguished from conventional computer-based simulation by its use of physical interfaces such as head-mounted displays, motion sensors, haptic devices, and voice recognition, rather than a standard keyboard and mouse alone.
AR is a technology that enhances user experiences by overlaying digital, computer-generated information on real-world objects or places.
MR is a space where the physical and virtual worlds co-exist, presenting real and virtual objects together within a single display.
Key research areas of the Multisensory Brain and Cognition Lab include:
Multisensory Integration: Investigating the neural and cognitive processes underlying the integration of information from various sensory modalities in real and digital environments, including the metaverse (an illustrative model is sketched after this list).
Spatial Awareness and Navigation: Examining the cognitive processes that allow us to navigate and orient ourselves in both physical and virtual spaces, informing the development of user-friendly and accessible metaverse experiences.
Human-Computer Interaction: Studying new and efficient ways for humans to interact with digital systems in immersive environments, with the goal of enhancing productivity and user experience within the metaverse.
Cognitive and Neural Rehabilitation: Exploring the potential of VR, AR, and MR technologies in developing personalized therapeutic interventions for individuals with cognitive and neurological impairments, focusing on improving their quality of life in the digital world.
Timing and Time Perception: Investigating how we perceive and process time in virtual and augmented environments, with implications for understanding the effects of the metaverse on our cognitive abilities and daily functioning.
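To make the first of these research areas concrete, the sketch below shows a reliability-weighted (maximum-likelihood) model of cue combination, a standard textbook account of multisensory integration in which each sense's estimate is weighted by its precision. It is an illustrative example only; the function name and the numbers used are hypothetical and are not drawn from the lab's own studies.

    # Illustrative sketch of maximum-likelihood multisensory cue integration.
    # Two noisy estimates of the same property (e.g., an object's size as seen
    # and as felt) are combined by weighting each cue by its reliability
    # (the inverse of its variance). All example values are hypothetical.

    def integrate_cues(estimate_a, var_a, estimate_b, var_b):
        """Return the reliability-weighted combined estimate and its variance."""
        reliability_a = 1.0 / var_a
        reliability_b = 1.0 / var_b
        w_a = reliability_a / (reliability_a + reliability_b)  # weight on cue A
        combined = w_a * estimate_a + (1.0 - w_a) * estimate_b
        combined_var = 1.0 / (reliability_a + reliability_b)   # lower than either cue alone
        return combined, combined_var

    # A precise visual estimate (10 cm, variance 1) and a noisier haptic
    # estimate (12 cm, variance 4) yield a combined estimate of 10.4 cm
    # with variance 0.8, more precise than either sense on its own.
    size, variance = integrate_cues(10.0, 1.0, 12.0, 4.0)
    print(f"combined estimate: {size:.1f} cm, variance: {variance:.1f}")

The key property illustrated is that the combined estimate is at least as precise as the best single cue, the signature of statistically optimal integration that experiments in this area test for in both physical and virtual settings.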
In collaboration with researchers, clinicians, educators, and industry partners, the Multisensory Brain and Cognition Lab strives to advance the scientific understanding of human perception, cognition, and action in the context of the metaverse. Our goal is to contribute to the development of a metaverse that is seamlessly integrated into our lives, while taking into consideration the potential effects on human health and well-being.
We invite you to learn more about our research and join us in our pursuit of knowledge as we explore the frontiers of VR, AR, and MR technologies. Welcome to the Multisensory Brain and Cognition Lab, where rigorous scientific inquiry informs the future of digital experiences.
Dr. Michael Barnett-Cowan | Associate Professor | Department of Kinesiology and Health Sciences | BMH Building 1042 (office), TJB 1001-1003 (lab) | University of Waterloo | 200 University | Waterloo, Ontario N2L 3G1 Canada | mbc@uwaterloo.ca