The season of holiday gift buying is upon us, and it can be hard to resist the coolest new tech gadgets. But not all items are created equal when it comes to privacy, experts say.
In the US, there are few limits on what companies can do with your data, putting the onus on us to do our homework, says Hayley Tsukayama, a senior legislative activist at the digital advocacy group the Electronic Frontier Foundation. She urges people to think through the privacy implications of gifts they’re giving to friends and family.
“Think about what information is going to be collected,” she said. “And how comfortable you are with that information potentially flowing to just anybody … [Companies] are certainly sharing [user data] and they don’t really have to tell you who they’re sharing it with or why.”
Such items might include “smart devices” that track our behavior, such as sleep and fitness trackers, as well as popular self-discovery tools such as DNA testing kits.
With the help of experts, we broke down the privacy implications of some of this season’s latest offerings – so you can give the gift of privacy.
Amazon Halo Rise
What it is: The Halo Rise is an Alexa-enabled sleep monitoring device that uses “silent, no-contact sensor technology” to track your “body movement and breathing to calculate sleep stages”. In other words, it knows when you’re sleeping and knows when you’re awake.
In addition to monitoring the movement of the person sleeping closest to it, the bedside tracker monitors the environment, making notes on “room temperature, humidity and light disturbances” to assess a person’s sleep quality. The device then connects to an Amazon-owned app called Halo and provides a “sleep score”.
Privacy implications: Amazon says there’s no camera or microphone on the device, so it isn’t recording audio or video. But all the data collected on your sleep pattern and environment is sent to Amazon’s cloud system for processing. The company’s privacy policy indicates users can easily delete their data and turn off tracking, but that data will be shared with third-party content providers in an anonymized, aggregated form “to help improve the content they provide for Halo members using their service”.
The Halo app also collects data about the way you use the app and the device, including whether you’ve plugged your device into a charger and “how often you open or access particular pages within the app, or whether you’ve used a particular feature”. Since this data isn’t end-to-end encrypted, it is also vulnerable to law enforcement requests.
“The proliferation of these devices, many of which have dubious connections to improving people’s health outcomes, normalizes a world where our movements and bodily functions are constantly under surveillance,” said Chris Gilliard, a visiting research fellow at the Harvard Kennedy School’s Shorenstein Center and a professor at Macomb Community College. “This kind of society will not be good for anyone, except the companies selling these technologies.”
Amazon Astro (coming soon)
For those looking for gifts in the new year, Amazon has hinted that this invitation-only product will be widely available in the coming months, but it already has experts worried. The “household robot for home monitoring” is a cross between Amazon’s voice assistant, Alexa, and its security camera, Ring, except it roams around and surveils the inside of your house. According to the product description, it is equipped with a camera, microphone and motion sensors that connect with Ring to “investigate activity”. It can also send alerts if “an unrecognized person” or “certain sounds” are detected. The robot also maps out parts of your home, such as where windows and doors are located.
Privacy implications: There have long been concerns about Ring’s work with law enforcement, Alexa devices recording people’s conversations even when not asked to, as well as the failure of existing facial recognition systems to recognize Black and brown faces. The combination of all these functions in one mini robot has privacy advocates on high alert. For instance, a mini-robot owner can set it to automatically alert them if someone “unrecognized” is detected. But there’s a chance the facial recognition system could wrongly identify a person as a stranger, and if the product is connected to a person’s Ring account, it could send a Ring alert and store footage on a Ring server, where it could become open to law enforcement requests.
Police accessed user video footage 11 times in 2022 without user consent, according to a letter Amazon sent to the Democratic Massachusetts senator Ed Markey.
What the company says: An Amazon spokesperson, Katie Stafford, said “privacy and security are foundational” to how the company designs every device. “We are dedicated to providing customers with transparency and controls over their experience, making privacy controls incredibly easy to use and understand, and keeping customer information safe,” she said in an emailed statement.
Google Nest Hub
What it is: There are several generations of this smart home, voice-enabled device from Google, which comes with a touch screen and built-in cameras from Google’s home-security brand, Nest, and is connected to Google Assistant. The functions can vary slightly between versions but generally include performing searches, playing YouTube videos, enabling purchases and camera and audio monitoring of your home, and even monitoring your sleep. The second-generation Nest Hub includes a proximity sensor, to detect when something or someone is near it.
Privacy implications: The thing to remember about Google is that its main revenue stream is advertising, and collecting information about you is the best way to target those ads. In addition, the company receives tens of thousands of requests from government agencies every year because of the vast amounts of data it maintains on users.
With the Google Nest Hub, the company says that while it doesn’t harvest data from your audio recordings itself, it may use the transcript of your recordings to inform what ads you see. There is a physical button to shut off the microphone and you can also turn off the camera, if you choose. However, according to the company’s product description, when the mic and camera are on they can do everything from detecting an unrecognized person to listening for sounds like breaking glass or snoring.
And with Google facing scrutiny for its data retention practices on sensitive topics such as abortion, it is wise to consider whether you’re comfortable having your search, purchase or communication history collected and stored.
“We shouldn’t ignore everything that Google is putting out in the home surveillance space,” said Albert Fox Cahn, the founder of the advocacy group Surveillance Technology Oversight Project. “In many cases, it can even go farther than what Ring is offering.
“I don’t want Google to know every time I get in and out of bed,” he added.
What the company says: A Google spokesperson, Evan Barbour Grippi, reiterated the many ways users can turn off the microphone and cameras and delete their data, including facial and audio data. She said Face Match and sleep monitoring were off by default and that sleep data was not used for ad personalization. She added that Google Assistant devices were always in “standby” mode, listening for an activation such as “hey, Google”, but audio recorded in standby mode was not shared with the company. “After Google Assistant detects an activation, it exits standby mode and sends your request to Google servers. This can also happen if there is a noise that sounds like ‘Hey Google’ or an unintended manual activation,” Grippi said in an emailed statement.
Fitness and location trackers
What they are: Fitness trackers like the Apple Watch and the Google-owned Fitbit are best known for their activity and health monitoring capabilities, while location trackers like AirTags are small devices you can attach to your keys or other belongings to keep track of them. While their functions are different, we’re including them in one category for the purposes of this list because both have various location-tracking functions.
Privacy implications: Experts are generally worried about the ability of other people and law enforcement to gain access to your location through any device that keeps track of it. Fitness trackers and AirTags travel with you – or your things – and can paint a clear map of all the places you’ve been. However, Apple has made it easier to secure that data through an opt-in feature called “advanced data protection for iCloud”, which makes all data backed up to iCloud, including Watch location information, end-to-end encrypted. In addition to enabling that feature on Apple devices, you can turn off location services on Apple Watch entirely. As for Google, the company says you can easily delete your Fitbit data. But when location tracking is enabled and available, that information can be subject to law enforcement requests if it’s not end-to-end encrypted.
“We’ve been raising the alarm bell about the ways that Apple makes it possible for police to legally track our locations using various location tracking surfaces,” Cahn said.
Apple AirTags have raised concerns over their potential to be used in dangerous ways. According to news reports and a recent lawsuit, stalkers have attached AirTags to unsuspecting people in an effort to keep tabs on their locations. And despite changes Apple has made to the device to make it easier to find out whether an AirTag that doesn’t belong to you is somewhere on your person, a new lawsuit alleges those changes have been insufficient.
What the company says: Apple declined to comment but directed the Guardian to several websites including information about its Safety Check feature, which lets a user “quickly stop sharing” their information if they feel their personal safety is at risk, as well as the updates the company rolled out in response to AirTag stalking concerns.
DNA testing kits
What they are: The possibility of unlocking secrets about our health or ancestry means companies like 23andMe, FamilyTreeDNA and AncestryDNA are as popular as ever. But just how private is sharing a DNA sample?
Privacy implications: These companies handle arguably the most personal information about you, and there is a real possibility that information could be shared with third-party advertisers, researchers and even law enforcement. For instance, Ancestry and 23andMe share anonymized data from consenting users with researchers. 23andMe, for its part, said it vetted all vendors for privacy and security safeguards including whether they respond to law enforcement requests.
Ancestry companies’ cooperation with law enforcement made headlines when police identified the Golden State Killer using genealogy websites. And the practice has become fairly common: in 2019, FamilyTreeDNA admitted to sharing user data with the FBI without customers’ consent, for instance.
While these companies say they only respond to “valid” legal requests and 23andMe says it fights these requests off when it can, in many cases there’s only so much a company can do to ward off government efforts to get user data. So be aware that using these kits can result in your DNA, and thus information about not just you but also relatives, landing in the hands of government agencies.
What the companies say: Ancestry said customer privacy was the highest priority, according to an emailed statement from a spokesperson, Gina Spatafore. “Ancestry does not share our customer’s DNA data with insurers, employers, or third-party marketers. Ancestry does not voluntarily cooperate with law enforcement unless compelled to by valid legal process, such as a court order or search warrant,” Spatafore said. “Ancestry has provided customers with the voluntary option to participate in a small number of academic and scientific research studies to contribute to scientific discoveries – as publicly disclosed on our website.”
23andMe said it “does not partner or otherwise work with any law enforcement agencies”, according to an emailed statement from Jacquie Cooke, general counsel and privacy officer. “23andMe will exercise any available legal measures to object to a law enforcement request,” the statement reads. “To date, we have not released any individual customer data to law enforcement.”