The MIT.nano Immersion Lab, MIT’s first open-access facility for augmented and virtual reality (AR/VR) and for interacting with data, is now open and available to MIT students, faculty, researchers, and external users.
This powerful set of capabilities is located on the third floor of MIT.nano in a two-story space resembling a black-box theater. The Immersion Lab combines embedded systems, individual instruments and platforms, and the data capacity to support new modes of teaching and applications such as creating and experiencing immersive environments, human motion capture, 3D scanning for digital assets, 360-degree modeling of spaces, interactive computation and visualization, and interfacing of the physical and digital worlds in real time.
“Give the MIT community a unique set of tools and their relentless curiosity and penchant for experimentation is bound to create striking new paradigms and open new intellectual vistas. They will probably also invent new tools along the way,” says Vladimir Bulović, the founding faculty director of MIT.nano and the Fariborz Maseeh Chair in Emerging Technology. “We are excited to see what happens when students, faculty, and researchers from different disciplines start to connect and collaborate in the Immersion Lab — activating its virtual realms.”
A major focus of the lab is to support data exploration, allowing scientists and engineers to analyze and visualize their research at the human scale with large, multidimensional views, enabling visual, haptic, and aural representations. “The facility offers a new and much-needed laboratory to individuals and programs grappling with how to wield, shape, present, and interact with data in innovative ways,” says Brian W. Anthony, the associate director of MIT.nano and faculty lead for the Immersion Lab.
Massive data is one output of MIT.nano, as the workflow of a typical scientific measurement system within the facility requires iterative acquisition, visualization, interpretation, and analysis of data. The Immersion Lab will accelerate the data-centric work not only of MIT.nano researchers, but also of others who step into its space, driven by their pursuits of science, engineering, art, entertainment, and education.
Tools and capabilities
The Immersion Lab not only assembles a variety of advanced hardware and software tools, but is also an instrument in and of itself, says Anthony. The two-story cube, measuring approximately 28 feet on each side, is outfitted with an embedded OptiTrack system that enables precise motion capture via real-time active or passive 3D tracking of objects, as well as full-body motion analysis with the associated software.
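To give a flavor of the computation that underlies optical motion capture in general (a minimal sketch, not MIT.nano’s or OptiTrack’s actual software), the example below recovers a rigid body’s rotation and translation from tracked marker positions with the standard Kabsch/SVD method; the marker coordinates are synthetic.

```python
# Minimal sketch: recover a rigid body's pose (rotation + translation)
# from tracked 3D marker positions using the Kabsch/SVD method.
# Marker data here is synthetic; a real motion-capture system streams
# frames like this hundreds of times per second.
import numpy as np

def rigid_body_pose(reference, observed):
    """Return rotation R and translation t mapping reference markers onto observed ones."""
    ref_c = reference - reference.mean(axis=0)
    obs_c = observed - observed.mean(axis=0)
    H = ref_c.T @ obs_c                        # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = observed.mean(axis=0) - R @ reference.mean(axis=0)
    return R, t

# Four markers on a rigid object, then one "captured frame": the object
# rotated 30 degrees about z and shifted in space.
markers = np.array([[0.0, 0.0, 0.0],
                    [0.1, 0.0, 0.0],
                    [0.0, 0.1, 0.0],
                    [0.0, 0.0, 0.1]])
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
frame = markers @ R_true.T + np.array([1.0, 2.0, 0.5])

R_est, t_est = rigid_body_pose(markers, frame)
print(np.allclose(R_est, R_true), np.round(t_est, 3))   # True [1. 2. 0.5]
```

Repeating the estimate on every captured frame yields a pose trajectory over time, which is the raw material for the kind of full-body motion analysis described above.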
Complementing the built-in systems are stand-alone instruments for studying data, analyzing and modeling the physical world, and generating new immersive content, including:
- a Matterport Pro2 photogrammetric camera to generate 3D, geographically and dimensionally accurate reconstructions of spaces (Matterport can also be used for augmented reality creation and tagging, virtual reality walkthroughs, and 3D models of the built environment);
- a Lenscloud system that uses 126 cameras and custom software to produce high-volume, 360-degree photogrammetric scans of human bodies or human-scale objects;
- software and hardware tools for content generation and editing, such as 360-degree cameras, 3D animation software, and green screens;
- backpack computers and VR headsets to allow researchers to test and interact with their digital assets in virtual spaces, untethered from a stationary desktop computer; and
- hardware and software to visualize complex and multidimensional datasets, including HP Z8 data science workstations and Dell Alienware gaming workstations (see the visualization sketch after this list).
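As a rough, self-contained illustration of that last item, the sketch below renders a synthetic five-dimensional dataset as a 3D scatter plot, encoding the two extra dimensions as color and marker size. All data and parameter names are invented for the example; the lab’s workstations and headsets drive far richer, interactive and immersive views.

```python
# Minimal sketch: visualize a synthetic five-dimensional dataset as a
# 3D scatter plot, with the fourth dimension mapped to color and the
# fifth to marker size. Illustrative only; all values are made up.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 500
x, y, z = rng.normal(size=(3, n))                          # three spatial dimensions
temperature = x**2 + y**2 + rng.normal(scale=0.3, size=n)  # hypothetical 4th dimension
intensity = np.abs(z) + 0.1                                # hypothetical 5th dimension

fig = plt.figure(figsize=(7, 6))
ax = fig.add_subplot(projection="3d")
points = ax.scatter(x, y, z, c=temperature, s=40 * intensity, cmap="viridis")
fig.colorbar(points, ax=ax, label="temperature (arbitrary units)")
ax.set_xlabel("x"); ax.set_ylabel("y"); ax.set_zlabel("z")
plt.show()
```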
Like MIT.nano’s fabrication and characterization facilities, the Immersion Lab is open to researchers from any department, lab, and center at MIT. Expert research staff are available to assist users.
Support for research, courses, and seminars
Anthony says the Immersion Lab is already supporting cross-disciplinary research at MIT, working with multiple MIT groups for diverse uses — quantitative geometry measurements of physical prototypes for advanced manufacturing, motion analysis of humans for health and wellness uses, creation of animated characters for arts and theater production, virtual tours of physical spaces, and visualization of fluid and heat flow for architectural design, to name a few.
The MIT.nano Immersion Lab Gaming Program is a four-year research collaboration between MIT.nano and video game development company NCSOFT that seeks to chart the future of how people interact with the world and each other via hardware and software innovations in gaming technologies. In the program’s first two calls for proposals, in 2019 and 2020, 12 projects from five different departments were awarded a combined $1.5 million in research funding. The collaborative proposal selection process by MIT.nano and NCSOFT ensures that the awarded projects develop industrially impactful advances and that MIT researchers are exposed to technical practitioners at NCSOFT.
The Immersion Lab also partners with the Clinical Research Center (CRC) at the MIT Institute for Medical Engineering and Science to generate a human-centric environment in which to study health and wellness. Through this partnership, the CRC has provided sensors, equipment, and expertise to capture physiological measurements of a human body while immersed in the physical or virtual realm of the Immersion Lab.
Undergraduate students can use the Immersion Lab through sponsored Undergraduate Research Opportunities Program (UROP) projects. Recent UROP work includes jumping as a new form of locomotion in virtual reality and analyzing human muscle lines using motion capture software. Starting with MIT’s 2021 Independent Activities Period, the Immersion Lab will also offer workshops, short courses, and for-credit classes in the MIT curriculum.
Members of the MIT community and the general public can learn more about the application areas supported by the Immersion Lab through a new seminar series, Immersed, beginning in February. This monthly event will feature talks by experts in the fields where the lab is currently active, highlighting future goals to be pursued with its immersive technologies. Slated topics include motion in sports, uses for photogrammetry, rehabilitation and prosthetics, and music and the performing arts.
New ways of teaching and learning
Virtual reality makes it possible for instructors to bring students to environments that are hard to access, whether geographically or at scale. New modalities that introduce the language of gaming into education allow students to discover concepts for themselves.
As a recent example, William Oliver, associate professor in electrical engineering and computer science, is developing Qubit Arcade to teach core principles of quantum computing via a virtual reality demonstration. Users can create Bloch spheres, control qubit states, measure results, and compose quantum circuits in an intuitive 3D representation with virtualized quantum gates.
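The single-qubit mathematics behind such a demonstration is compact enough to sketch: the snippet below applies a Hadamard gate to the |0⟩ state and computes the resulting Bloch-sphere coordinates and measurement probabilities. This is generic quantum mechanics written for illustration, not code from Qubit Arcade.

```python
# Minimal sketch of the single-qubit math a Bloch-sphere demo visualizes:
# apply a Hadamard gate to |0> and map the resulting state onto
# Bloch-sphere coordinates. Generic quantum mechanics, not Qubit Arcade code.
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)                    # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                                              # (|0> + |1>) / sqrt(2)

# Expectation values of the Pauli matrices give the Bloch vector.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
bloch = [float(np.real(state.conj() @ s @ state)) for s in (sx, sy, sz)]
print("Bloch vector:", np.round(bloch, 3))                    # [1. 0. 0.] -> +x axis

# Measurement probabilities in the computational basis.
print("P(0), P(1):", np.round(np.abs(state) ** 2, 3))         # [0.5 0.5]
```

The Bloch vector (1, 0, 0) is the equal-superposition state, which sits on the sphere’s equator; a demonstration like Qubit Arcade renders states such as this as points on a 3D sphere that users can manipulate directly.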
IMES Director Elazer Edelman, the Edward J. Poitras Professor in Medical Engineering and Science, is using the Immersion Lab as a teaching tool for interacting with 3D models of the heart. With the 3D and 4D visualization tools of the Lab, Edelman and his students can see in detail the evolution of congenital heart failure models, something his students could previously only study if they happened upon a case in a cadaver.
“Software engineers understand how to implement concepts in a digital environment. Artists understand how light interacts with materials and how to draw the eye to a particular feature through contrast and composition. Musicians and composers understand how the human ear responds to sound. Dancers and animators understand human motion. Teachers know how to explain concepts and challenge their students. Hardware engineers know how to manipulate materials and matter to build new physical functionality. All of these fields have something to contribute to the problems we are tackling in the Immersion Lab,” says Anthony.
A faculty advisory board has been established to help the MIT.nano Immersion Lab identify opportunities enabled by the current tools and those that should be explored with additional software and hardware capabilities. The lab’s advisory board currently comprises seven MIT faculty from six departments. Such broad faculty engagement ensures that the Immersion Lab engages in projects across many disciplines and launches new directions of cross-disciplinary discoveries.
Visit nanousers.mit.edu/immersion-lab to learn more.