From Discover Magazine:

Immersive environments — 3-D virtual reality worlds — can help us make sense of this [large amounts of data, and sophisticated algorithms] in a tangible way. Big data collects such a vast amount of information that it’s difficult to see patterns. Using computing power to translate data into something that can be seen and heard makes it easier to understand. “Scientists and engineers can work with their data, perceptually and intuitively, the way artists do,” says JoAnn Kuchera-Morin, creator of the AlloSphere, perhaps the most advanced of these immersive environments, housed on the campus of the University of California, Santa Barbara.

These electronically simulated worlds of sight and sound cut through a lot of the noise of big data, and they enable researchers to synthesize, manipulate and analyze large data sets in a way that is easier to comprehend and digest, providing unparalleled insights into the whole picture and how each individual piece fits in. “We have so much data that we need these bigger lenses to get a full picture of what’s really going on,” says Andrew Johnson, director of research at the Electronic Visualization Laboratory at the University of Illinois at Chicago. “These kinds of environments are lenses to look at data — the modern equivalent of the microscope or telescope.”

Pooling massive amounts of data allows patterns and trends to emerge that aren’t apparent in small, individual studies, and the applications are virtually infinite — think Moneyball, the 2003 best-selling book about how the perennially cash-strapped Oakland A’s used analytics and baseball stats to scout overlooked talent. Another example: In 2013, number-crunching algorithms sifting through terabytes of data spotted the distinctive signature of several Higgs boson particles, finally allowing physicists to identify them. Medical scientists, meanwhile, are crunching billions of data points culled from millions of patients about genetic mutations that make people more vulnerable to diseases like diabetes, heart disease and cancer. They combine this information with sequences of the proteins those bits of DNA produce. (Proteins are the body’s workhorses that control every cell.) This information is used to concoct more targeted therapeutics and more precise diagnostics using biomarkers — in a patient’s blood, saliva or urine — that signal the presence of a disease.

Immersive environments like the ones you’ll see in the following pages allow scientists to watch a tumor grow, observe molecules binding together — or even see a re-enactment of the Big Bang and witness the transformation of the universe over billions of years. Rudimentary versions of these environments have been around since the 1990s, but with today’s technology, scientists can sink into even greater realism and visualize more with sharper resolution. This immersion is used in disciplines as diverse as medicine, physics, neuroscience, green technology, structural engineering and archaeology at universities, government research agencies and in private industry all over the world.

“Originally, we created these as an educational tool for visualizing concepts and ideas — in place of a blackboard and hand waving — to help people see things they never did before,” says Thomas DeFanti, a research scientist at UC San Diego’s California Institute for Telecommunications and Information Technology, and a pioneer of virtual reality systems. “But the newest technology gives you the feeling of true immersion that makes for a completely riveting experience.”

Inside the AlloSphere

“Shall we, Matt?”

JoAnn Kuchera-Morin instructs her media systems engineer, Matthew Wright, to fire up the computer cluster that powers the AlloSphere. With a simple keystroke, we’re suddenly plunged into a virtual world of sight and sound that transports us on a fantastic voyage through a three-dimensional model of the human body. We hurtle down an artery, as if we’re sliding down a slippery chute, and nearly collide with the liver and heart. It feels as if we’re propelled, airborne and hovering in free fall in an onrush of images in the darkened chamber.

We’re wearing 3-D glasses and standing on a sturdy metallic catwalk suspended at the center of a 33-foot-diameter sphere, which seems to be floating inside a 2,000-square-foot room three stories high. Dozens of speakers and other audio equipment envelop us in sound from every direction, while high-resolution video projectors beam floor-to-ceiling images in 40-million-pixel detail. This all creates a unique 360-degree immersive environment that far outstrips the technology of other virtual reality systems. Here, researchers can use all of their senses to uncover new patterns in the data.

The AlloSphere cost $12 million for the structure alone and was completed in 2007. It is the brainchild of Kuchera-Morin, an orchestrally trained composer turned computer geek who directs the AlloSphere Research Laboratory at the University of California, Santa Barbara, perched on the rocky shoreline of the Pacific. A gregarious woman clad all in black with long, straight gray hair that makes her resemble a hippie grandmother, Kuchera-Morin began dabbling with big mainframe computers in the 1980s, when traditional instruments couldn’t translate the sounds she heard in her head into music.

“The computer helped me understand all of the acoustics, vibrations and physics of instruments,” she says. “And through mathematical equations, I could transform them into anything I wanted to.”

Her early experiments ultimately evolved into the AlloSphere, which converts reams of data into moving images and sound that are easier for researchers to comprehend and digest. Sometimes, dozens of scientists in data-rich disciplines ranging from neuroscience and medicine to green tech, theoretical physics, materials science and nanotechnology gather on this bridge. They use special wireless controllers and sensors embedded in the railings to maneuver through the constellation of images. Physicists can watch representations of electrons spinning inside hydrogen atoms, allowing them to actually “see” these invisible processes of nature, while neuroscientists can seemingly fly through 3-D images inside a patient’s brain. “Everything you see is a number that’s been crunched,” says Kuchera-Morin. “Mathematical algorithms can be translated into visual and audio frequencies by mapping their vibratory spectrum in the light and sound domain — like mapping heat through infrared light. The AlloSphere is a virtual instrument that allows scientists to do simulations, which will speed up time to discovery.”

On this particular day, we’re looking at a project by Jamey Marth, director of the Center for Nanomedicine at UC Santa Barbara. Marth is using the simulation version of the human body to examine the makeup and behavior of critical cell components, such as proteins, lipids (fats) and glycans (sugars). This particular simulation was built with MRI information collected from a living human body. Using specialized software and computational language to translate mathematical algorithms and scientific data into sight and sound, Kuchera-Morin’s band of techies first integrated the geometries of the arteries, veins, pancreas and liver, and then scaled them up like a high-powered digital microscope so researchers can better visualize the biological processes of health and disease.

Right now, Marth’s team is simulating the transport of chemotherapy directly to cancerous tumors in the pancreas and liver without harming healthy tissue. Artificial nanoscale particles might prove a good delivery vehicle. But first, the researchers have to gauge whether the organic nanoparticles can successfully navigate through blood vessels and then bind with cancer cells to deliver their toxic payload. In the AlloSphere, it’s as if Marth’s researchers are standing inside blood vessels, visualizing data on a human scale that is normally too small to see. The next step is to integrate fluid dynamics to simulate precisely how blood flows through arteries and veins. They’ll also work with materials scientists to create a reproduction that mirrors the composition of different-shaped nanoparticles to see how they navigate through the bloodstream so the team can run virtual tests of new treatments in nanomedicine.

“We need to design nanoparticles that will, like a lock-and-key mechanism, travel through the body and interact only with the diseased cell surface,” says Marth. “Right now, we use MRIs and PET scans to visualize these processes, but other imaging approaches are needed — and that’s where the AlloSphere comes in. This is the breeding ground for the next generation of solutions in medicine.”

Source: Diving Into the Data, Literally