On a recent Friday afternoon, Kashif Hoda was waiting for a train near Harvard Square when a young man asked him for directions. Hoda was struck by the man’s nerdy, thick-framed glasses, but he did not realize that they were Ray-Ban Meta smart glasses and that a small white light indicated that they were recording.

A few minutes later, as Hoda’s train was pulling into the station, the bespectacled man, who was a junior at Harvard University named AnhPhu Nguyen, approached him again.

“Do you happen to be the person working on minority stuff for Muslims in India?” Nguyen asked.

Hoda was shocked. He worked in biotechnology, but had previously been a journalist and had written about marginalized communities in India.

“I’ve read your work before,” Nguyen said. “That’s super cool.”

They shook hands, but Hoda didn’t have time to continue the conversation because his train had arrived. He posted on social media, reflecting on how strange the encounter had been.

A month later, he found out just how strange. He had been an unwitting guinea pig in an experiment meant to show just how easy it was to rig artificial intelligence tools to identify someone and retrieve the person’s biographical information — potentially including a phone number and home address — without the person’s realizing it.

A friend texted Hoda, telling him that he was in a video that was going viral. Nguyen and a fellow Harvard student, Caine Ardayfio, had built glasses that could identify strangers in real time, and had demonstrated them on two “real people” at the subway station, including Hoda, whose name was incorrectly transcribed in the video captions as “Vishit.”

Nguyen and Ardayfio, who are both 21 and studying engineering, said in an interview that their system relied on widely available technologies (a rough sketch of how the pieces chain together follows the list):

• Meta glasses, which livestream video to Instagram.

• Face detection software, which captures faces that appear on the livestream.

• A face search engine called PimEyes, which finds sites on the internet where a person’s face appears.

• A ChatGPT-like tool that was able to parse the results from PimEyes to suggest a person’s name and occupation, as well as look up the name on a people search site to find a home address, a phone number and relatives.
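In code, that four-stage chain might look something like the following. This is a minimal sketch under stated assumptions, not the students’ actual program: the face-detection step uses OpenCV’s bundled Haar cascade, and because PimEyes, the ChatGPT-like parser and the people-search site expose no official public APIs to call here, search_face(), guess_identity() and lookup_details() are hypothetical placeholders for those stages.

```python
# A minimal sketch of the pipeline described above -- not the students' code.
# Face detection uses OpenCV's bundled Haar cascade; the remaining stages are
# hypothetical stubs, since PimEyes and people-search sites offer no official
# public API to call from a script like this.
import cv2


def detect_faces(frame):
    """Crop out any faces found in a single frame of livestreamed video."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [frame[y:y + h, x:x + w] for (x, y, w, h) in boxes]


def search_face(face_image):
    """Hypothetical: a reverse face search (e.g. PimEyes) returning URLs
    of pages where the face appears."""
    raise NotImplementedError


def guess_identity(urls):
    """Hypothetical: an LLM parses the pages behind the URLs and suggests
    a likely name and occupation."""
    raise NotImplementedError


def lookup_details(person):
    """Hypothetical: query a people-search site for a home address,
    phone number and relatives."""
    raise NotImplementedError


def identify_people(frame):
    """Chain the four stages for every face found in the frame."""
    results = []
    for face in detect_faces(frame):
        urls = search_face(face)
        person = guess_identity(urls)
        results.append((person, lookup_details(person)))
    return results
```

Even in this stripped-down form, the chain suggests why the lookup is not instantaneous: every stage after detection waits on a remote service, which is consistent with the roughly 90-second runtime the students describe below.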

“All the tools were there,” Nguyen said. “We just had the idea to combine them together.”

The video makes it appear as if the system works instantaneously and consistently on everybody. But the process took a minute and a half, the students said, and worked on about a third of the people they tested it on.

Coding the system took just four days. “We spent most of the time making the video,” Ardayfio said.

The technology to put a name to a face is now free or cheap to use, so whether to exercise that ability has become mostly a matter of ethics and propriety.

Nguyen and Ardayfio said they enjoyed doing random projects for fun and had recently created a flamethrower. That experiment singed Ardayfio’s leg, but it was the facial recognition system that blew up, metaphorically. Given how accessible face search engines are, they said they were surprised by how much attention the project garnered around the world. Its main novelties were incorporating the ChatGPT-like assistant and the Meta Ray-Bans.

Meta has discussed creating similar facial recognition glasses — and even developed an early prototype — but has not released the capability publicly because of legal and ethical concerns. When the students’ video was first reported by 404 Media, a Meta spokesperson, Andy Stone, dismissed the company’s role via a post on Threads.

“What these students have done would work with any camera, phone or recording device,” Stone wrote. “And unlike most other devices, Ray-Ban Meta glasses have an LED light that indicates to people that the user is recording.”

Hoda did not notice the light.

Multiple investors have since reached out to the students, in messages shared with The New York Times, offering to fund further development of the glasses. Ardayfio said they had no desire to commercialize this particular extracurricular project and had simply wanted to show it was possible.

In a Google Doc accompanying their video, they encouraged people to remove their information from data broker sites that can reveal names, home addresses and contact information.

“We want people to learn to protect themselves,” Ardayfio said. He and Nguyen removed information from data broker sites that would expose their home addresses, but did not attempt to make their faces unsearchable.