As smart glasses become more common, healthcare leaders are beginning to ask how the technology fits, or conflicts, with patient privacy expectations inside hospitals and health systems.
The question has gained urgency as consumer technology companies push smart glasses closer to the mainstream. This year, Meta released new versions of its Ray-Ban smart glasses featuring built-in displays and AI capabilities, signaling a broader shift toward hands-free, always-on wearable technology, according to CNBC and MIT Technology Review.
While the devices are marketed to consumers, some healthcare executives say their growing visibility raises difficult questions for hospitals and health systems that operate under strict privacy and regulatory requirements.
“People don’t necessarily know if they’re being recorded,” Julie Kline, chief human resources officer of Erie County Medical Center Corp., a Buffalo, N.Y.-based organization that operates a 573-bed academic medical center, told Becker’s. “And in healthcare, where privacy is everything, that’s alarming.”
Ms. Kline said her concerns were sparked after a human resources business partner raised questions about smart glasses following consumer sales promotions. The issue, she said, wasn’t tied to a specific incident but rather the realization that emerging devices could outpace existing workplace policies.
“I think it’s a reflection of how healthcare as a whole maybe isn’t proactively thinking about what’s going to happen in the future,” she said. “Whether it’s AI or something like this, I’m afraid we’re lagging behind the policy, procedures and security measures we need to ensure people’s privacy remains private.”
Smart glasses differ from smartphones in ways that complicate traditional device rules, particularly in clinical settings. Unlike phones, which are typically held in plain view, smart glasses sit at eye level and may capture audio or video in less obvious ways, a distinction that concerns some healthcare leaders.
Ms. Kline said existing policies may need to be revisited and possibly rewritten with worst-case scenarios in mind.
“When you’re creating policies, you have to think about the one person who’s going to do something wrong, not the 99.5% who are going to do the right thing,” she said. “Hope is a terrible strategy. Saying, ‘I hope this doesn’t happen,’ isn’t going to work.”
Learning from past technology risks
Ms. Kline drew parallels between smart glasses and earlier cybersecurity challenges, noting that healthcare organizations often strengthen protections only after a breach or high-profile incident.
“I’m afraid it’s going to take one explosive event for the rest of the industry to say, ‘Oh my gosh, I never thought about that,’” she said.
Her perspective is shaped by years of experience in HR executive consulting, where she was often brought in after organizations faced major policy or compliance failures.
“I’ve seen the panic that can come, the cost implications and the reputational damage that can come from organizations not having the right people focused on this,” she said.
Not just an IT issue
Ms. Kline emphasized that managing emerging technology risks should not fall solely on already-strained IT departments. Instead, she said smart glasses highlight the need for closer collaboration between human resources, information technology and security teams.
“Expecting an IT team to stay on top of this level of security, in addition to keeping the lights on and systems running, is naive,” she said.
At Erie County Medical Center, Ms. Kline said she works closely with the organization’s CIO to anticipate potential risks tied to workforce behavior, technology adoption and security.
“This is a topic where HR and IT need to be true partners,” she said. “Now is the time to do it.”
A more restrictive approach
While some leaders are still evaluating how smart glasses should be handled, other health systems say they have already taken a firm stance.
Britani Pinckard, chief information security officer at Baton Rouge, La.-based FMOL Health, said her organization has proactively updated its policies to address smart glasses and similar recording technologies.
“This is an issue that FMOL Health has been monitoring, and we have taken proactive strides to establish human resources and information technology policies that prohibit the use of unapproved smart glasses or recording devices in our facilities and work environments,” Ms. Pinckard told Becker’s.
The policies are designed to protect patient privacy, maintain regulatory compliance and uphold confidentiality standards in clinical settings, she said.
“Unauthorized recordings that may contain patient information potentially pose privacy risks,” Ms. Pinckard said.
For some organizations, prohibiting unapproved devices offers clarity at a time when wearable technology is evolving faster than formal regulatory guidance.
An uneven landscape
The contrast between concern and action reflects broader uncertainty across the industry. Ms. Kline said one of the most troubling signs, in her view, is how little discussion she sees about smart glasses within HR leadership circles.
“I’m not hearing about this in other HR spaces, and that scares the heck out of me,” she said, noting she has raised the issue through professional networks without receiving much response.
As smart glasses and other AI-enabled wearables become more visible in everyday life, healthcare leaders may increasingly be forced to confront where personal technology ends and institutional responsibility begins, and whether existing privacy frameworks are ready for what comes next.
The post Smart glasses spotlight privacy blind spots in healthcare appeared first on Becker’s Hospital Review | Healthcare News & Analysis.
