The writer is the lead consumer technology writer for The New York Times.
This month, I’ve been using a new camera to secretly snap photos and record videos of strangers in parks, on trains, inside stores and at restaurants. (I promise it was all in the name of journalism.) I wasn’t hiding the camera, but I was wearing it, and no one noticed.
I was testing the recently released $300 Ray-Ban Meta glasses that Mark Zuckerberg’s social networking empire made in collaboration with the iconic eyewear maker. The high-tech glasses include a camera for shooting photos and videos, and an array of speakers and microphones for listening to music and talking on the phone.
The glasses, Meta says, can help you “live in the moment” while sharing what you see with the world.
You can livestream a concert on Instagram while watching the performance, for instance, as opposed to holding up a phone. That’s a humble goal, but it is part of a broader ambition in Silicon Valley to shift computing away from smartphone and computer screens and toward our faces.
Meta, Apple and Magic Leap have all been hyping mixed-reality headsets that use cameras to allow their software to interact with objects in the real world. This month, Zuckerberg posted a video on Instagram demonstrating how the smart glasses could use AI to scan a shirt and help him pick out a pair of matching pants. Wearable face computers, the companies say, could eventually change the way we live and work. For Apple, which is preparing to release its first high-tech goggles, the $3,500 Vision Pro headset, next year, a pair of smart glasses that look nice and accomplish interesting tasks is the end goal.
For the past seven years, headsets have remained unpopular, largely because they are bulky and aesthetically off-putting. The minimalist design of the Ray-Ban Meta glasses represents how smart glasses might look one day if they succeed (though past lightweight wearables, such as the Google Glass from a decade ago and the Spectacles sunglasses released by Snap in 2016, were flops). Sleek, lightweight and satisfyingly hip, the Meta glasses blend effortlessly into the quotidian. No one — not even my editor, who was aware I was writing this column — could tell them apart from ordinary glasses, and everyone was blissfully unaware of being photographed.
After wearing the Ray-Ban Meta glasses practically nonstop this month, I was relieved to remove them. While I was impressed with the comfortable, stylish design of the glasses, I felt bothered by the implications for our privacy. I’m also concerned about how smart glasses may broadly affect our ability to focus. Even when I wasn’t using any of the features, I felt distracted while wearing them. But the main problem is that the glasses don’t do much we can’t already do with phones.
Meta said in a statement that privacy was top of mind when designing the glasses. “We know if we’re going to normalize smart glasses in everyday life, privacy has to come first and be integrated into everything we do,” the company said.
I wore the glasses and took hundreds of photos and videos while doing all sorts of activities in my daily life — working, cooking, hiking, rock climbing, driving a car and riding a scooter — to assess how smart glasses might affect us going forward. Here’s how that went.
My first test with the glasses was to wear them at my bouldering gym, recording how I maneuvered through routes in real time and sharing the videos with my climbing pals.
I was surprised to find that my climbing, overall, was worse than normal. When recording a climbing attempt, I fumbled with my footwork and fell. This was disappointing because I had successfully climbed the same route before. Perhaps the pressure to record and broadcast a smooth climb made me do worse. After removing the glasses, I completed the route.
This feeling of distraction persisted in other aspects of my daily life. I had problems concentrating while driving a car or riding a scooter. Not only was I constantly bracing myself for opportunities to shoot video, but the headlights of other cars also reflected through the eyeglass lenses in a harsh, blue strobe effect. Meta’s safety manual for the Ray-Bans advises people to stay focused while driving, but it doesn’t mention glare from headlights.
While I worked on a computer, the glasses felt unnecessary because there was rarely anything worth photographing at my desk, but part of my mind constantly felt preoccupied by the possibility.
Ben Long, a photography teacher in San Francisco, said he was skeptical about the premise of the Meta glasses helping people remain present.
“If you’ve got the camera with you, you’re immediately not in the moment,” he said. “Now you’re wondering, Is this something I can present and record?”
To inform people that they are being photographed, the Ray-Ban Meta glasses include a tiny LED light embedded in the right frame that indicates when the device is in use. When a photo is snapped, the light flashes momentarily. While a video is recording, it stays continuously illuminated.
As I shot 200 photos and videos with the glasses in public, including on BART trains, on hiking trails and in parks, no one looked at the LED light or confronted me about it. And why would they? It would be rude to comment on a stranger’s glasses, let alone stare at them.
Although the Ray-Ban Meta glasses didn’t make me feel more present or safer, they were good at capturing a particular type of photo — the slice-of-life moments I wouldn’t normally record because my hands were occupied.
But while these types of moments are truly precious, that benefit probably won’t be enough to convince a vast majority of consumers to buy smart glasses and wear them regularly, given the potential costs of lost privacy and distraction. - The New York Times