“What I really love about this concept of spatial computing is we’re not talking about the metaverse,” Sheryl Sleeva, Director of the Center for Entrepreneurial Leadership and Professor of Entrepreneurship and Innovation at Gordon College, said. “People get very dismissive right off the bat when you start talking about the metaverse. But that’s not how this is rolling out.”
Sleeva defines spatial computing simply: it’s a way to integrate the physical and digital environments in real time, enabling interaction with digital objects as if they existed in physical space. Built upon AR, VR, mixed reality, AI, sensors and 3D engines, the technology has moved well beyond the experimental phase in several key industries.
From factory floors to operating rooms
The earliest and most compelling use cases are concentrated in industries where the stakes are highest: healthcare, manufacturing, engineering, defense and complex training environments.
That’s according to futurist Daniel Burrus, CEO of Burrus Research and New York Times bestselling author, who describes spatial computing’s core power as its ability to transform how humans interact with digital information in physical environments. He offered the example of remote surgery where a surgeon in one city can use augmented reality to guide a nurse practitioner in a rural community through a complex procedure, assisted by a surgical robot.
“By the way, this is already happening,” Burrus said.