Friday, October 29, 2021

An essential tool for making bootleg or peeping tom videos

Can Facebook’s smart glasses be smart about security and privacy?


Is he looking at you or looking at personal information about you? CSA Images via Getty Images
Facebook’s smart glasses ambitions are in the news again. The company has launched a worldwide project dubbed Ego4D to research new uses for smart glasses.

In September, Facebook unveiled its Ray-Ban Stories glasses, which have two cameras and three microphones built in. The glasses capture audio and video so wearers can record their experiences and interactions.

The research project aims to add augmented reality features to smart glasses using artificial intelligence technologies that could provide wearers with a wealth of information, including the ability to get answers to questions like “Where did I leave my keys?” Facebook’s vision also includes a future where the glasses can “know who’s saying what when and who’s paying attention to whom.”

Several other technology companies like Google, Microsoft, Snap, Vuzix and Lenovo have also been experimenting with versions of augmented or mixed reality glasses. Augmented reality glasses can display useful information within the lenses, providing an electronically enhanced view of the world. For example, smart glasses could draw a line over the road to show you the next turn or let you see a restaurant’s Yelp rating as you look at its sign.

However, some of the information that augmented reality glasses give their users could include identifying people in the glasses’ field of view and displaying personal information about them. It was not too long ago that Google introduced Google Glass, only to face a public backlash for simply recording people. Compared to being recorded by smartphones in public, being recorded by smart glasses feels to people like a greater invasion of privacy.

As a researcher who studies computer security and privacy, I believe it’s important for technology companies to proceed with caution and consider the security and privacy risks of augmented reality.

Smartphones vs. smart glasses

Even though people are now used to being photographed in public, they typically expect the photographer to raise a smartphone to compose the shot. Augmented reality glasses fundamentally disrupt that sense of normalcy. The public setting may be the same, but the sheer scale and manner of recording have changed.

Such deviations from the norm have long been recognized by researchers as a violation of privacy. My group’s research has found that people in the vicinity of nontraditional cameras want a more tangible sense of when their privacy is being compromised because they find it difficult to know whether they are being recorded.

Absent the typical physical gestures of taking a photo, there need to be better ways for devices to convey to bystanders whether a camera or microphone is recording them. Facebook has already been warned by the European Union that the LED indicating that a pair of Ray-Ban Stories is recording is too small.

In the longer term, however, people might become accustomed to smart glasses as the new normal. Our research found that although young adults worry about others recording their embarrassing moments on smartphones, they have adjusted to the pervasive presence of cameras.

Smart glasses as a memory aid

An important application of smart glasses is as a memory aid. If you could record or “lifelog” your entire day from a first-person point of view, you could simply rewind or scroll through the video at will. You could examine the video to see where you left your keys, or you could replay a conversation to recall a friend’s movie recommendation.

Our research studied volunteers who wore lifelogging cameras for several days. We uncovered several privacy concerns – this time, for the camera wearer. Considering who, or what algorithms, might have access to the camera footage, people may worry about the detailed portrait it paints of them.

Who you meet, what you eat, what you watch and what your living room really looks like without guests are all recorded. We found that people were especially concerned about the places being recorded, as well as their computer and phone screens, which formed a large fraction of their lifelogging history.

Popular media already has its take on what can go horribly wrong with such memory aids. “The Entire History of You” episode of the TV series “Black Mirror” shows how even the most casual arguments can lead to people digging through lifelogs for evidence of who said exactly what and when. In such a world, it is difficult to just move on. It’s a lesson in the importance of forgetting.

Psychologists have pointed to the importance of forgetting as a natural human coping mechanism to move past traumatic experiences. Maybe AI algorithms can be put to good use identifying digital memories to delete. For example, our research has devised AI-based algorithms to detect sensitive places like bathrooms and computer and phone screens, which were high on the worry list in our lifelogging study. Once detected, footage can be selectively deleted from a person’s digital memories.
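To make the idea concrete, here is a minimal, hypothetical Python sketch of such a “detect, then selectively delete” pipeline. The scene labels and the is_sensitive_frame detector are placeholders invented for illustration, not the actual algorithms from the study.

# Hypothetical sketch: filter sensitive frames out of a lifelog.
# "is_sensitive_frame" stands in for whatever detector (for example,
# a scene classifier) flags bathrooms or computer and phone screens.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Frame:
    timestamp: float   # seconds since the recording started
    scene_label: str   # label a classifier might assign, e.g. "kitchen"

def is_sensitive_frame(frame: Frame) -> bool:
    """Stub detector: treat bathrooms and screens as sensitive."""
    return frame.scene_label in {"bathroom", "computer_screen", "phone_screen"}

def redact_lifelog(frames: List[Frame],
                   detector: Callable[[Frame], bool]) -> List[Frame]:
    """Keep only the frames the detector does not flag as sensitive."""
    return [f for f in frames if not detector(f)]

lifelog = [
    Frame(0.0, "kitchen"),
    Frame(1.0, "bathroom"),
    Frame(2.0, "computer_screen"),
    Frame(3.0, "living_room"),
]
print([f.scene_label for f in redact_lifelog(lifelog, is_sensitive_frame)])
# prints ['kitchen', 'living_room']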

X-ray specs of the digital self?

However, smart glasses have the potential to do more than simply record video. It’s important to prepare for the possibility of a world in which smart glasses use facial recognition, analyze people’s expressions, look up and display personal information, and even record and analyze conversations. These applications raise important questions about privacy and security.

We studied the use of smart glasses by people with visual impairments. We found that these potential users were worried about the inaccuracy of artificial intelligence algorithms and their potential to misrepresent other people.

Even if the algorithms were accurate, the participants felt it was improper to infer someone’s weight or age. They also questioned whether it was ethical for such algorithms to guess someone’s gender or race. Researchers have also debated whether AI should be used to detect emotions, which can be expressed differently by people from different cultures.

Augmenting Facebook’s view of the future

I have only scratched the surface of the privacy and security considerations for augmented reality glasses. As Facebook charges ahead with augmented reality, I believe it’s critical that the company address these concerns.

I am heartened by the stellar list of privacy and security researchers Facebook is collaborating with to make sure its technology is worthy of the public’s trust, especially given the company’s recent track record.

But I can only hope that Facebook will tread carefully and ensure that its view of the future includes the concerns of these and other privacy and security researchers.

This article has been updated to clarify that future Facebook augmented reality glasses will not necessarily be in the Ray-Ban Stories product line and that, while the company’s goals include identifying people, the Ego4D research data was not collected using facial recognition technology.

Apu Kapadia, Professor of Computer Science, Indiana University

This article is republished from The Conversation under a Creative Commons license. Read the original article.