Consumer Data Privacy Rights: Emerging Tech Blurs Lines


Data privacy is a fundamental right for Americans – but emerging technologies like drones, IoT and facial recognition are introducing gray areas.

LAS VEGAS – From drones to facial recognition, new technology applications are introducing unique consumer privacy issues for civil society — and U.S. lawmakers and legal teams are struggling to keep up.

Privacy is a fundamental human right for consumers, but new ways in which data is collected and shared are blurring the lines of privacy; for instance, think Amazon Alexa collecting voice audio data in the home, or Ring smart doorbells obtaining video footage of passersby on the street.

“Privacy is a human right. This is acknowledged more and more every year and has changed the way that we treat data,” said Janet De Guzman with OpenText, speaking at ENFUSE 2019 in Las Vegas on Wednesday. “But citizens are now learning what those rights are, and are asking for those rights to be implemented. Law enforcement agencies need to respond.”

The level of private data available on smartphones — as well as who owns that data and who can lawfully collect it — was highlighted in the 2016 court order mandating that Apple help the FBI crack the San Bernardino shooter’s iPhone. But newer types of tech applications are only further underscoring the question of user privacy.

A 2018 murder case in Dover, N.H. – and more recently, one in Florida – shed light on the private data collected by Alexa voice-assistant devices after Amazon was ordered to produce audio surrounding both incidents.

Amazon’s Ring, meanwhile, came under fire in 2019 after it was discovered the smart-doorbell maker was partnering with more than 400 police departments across the country for neighborhood surveillance.

Worse, the expansion of data collection into public spaces is spurring new questions around consent and consumers’ ability to opt out of data collection. For instance, in 2013 London came under fire for installing “smart trash cans” across the city that displayed advertisements – but that were also collecting signals from the Wi-Fi-enabled devices of passersby, said De Guzman. Facial recognition in public places – such as a new pilot program around the White House – has also drawn questions and concerns around consent, data privacy and storage.

A History of Privacy Issues 

Security experts say that consumers should have fundamental data-privacy rights. If a company or government collects a consumer’s private data, it must have a legal basis for doing so, and it must have appropriate security measures in place to properly protect that data. Individuals should also have the right to decide what personal data is stored and how, experts like De Guzman agree.

Janet De Guzman discusses data-privacy questions to consider at ENFUSE 2019.

Over the years, court systems, companies and lawmakers have mulled the data-privacy implications of new tech, as exemplified in various legal cases.

For instance, Riley v. California shed light on warrantless searches of cell phones. In 2014, the Supreme Court ruled that the “search incident to arrest” exception does not extend to a cell phone, and that police must obtain a warrant to search cellphone data.

Another case, U.S. v. Microsoft, brought data ownership and privacy into the spotlight after U.S. authorities in 2013 tried to access customer emails held by Microsoft in a data center in Dublin, Ireland, as part of a U.S. trafficking investigation. Microsoft argued that Irish authorities would need to give permission to obtain data stored in Ireland. (Congress later passed a controversial cross-border data-access law, dubbed the CLOUD Act, that aimed to prevent such conflicting legal obligations.)

These court cases show that “laws are slow to change, and rarely keep pace with such rapidly evolving technology,” said Herbert Joe, attorney and board-certified forensic examiner at Yonovitz & Joe. “The Fourth Amendment, barring unreasonable searches and seizures, dates back to the late 1700s – but new tech is creating questions around what is a search and what is probable cause.”

Fear of the Unknown

Lawmakers, for their part, are taking steps to regulate data privacy – but they still have a long way to go.

The General Data Protection Regulation (GDPR), implemented in 2018, has paved the way for other privacy laws with tight regulations on how companies collect data – and has resulted in fines being imposed on Google, British Airways and more.

In the U.S., other regulatory efforts aimed at data privacy have been introduced, including the California Consumer Privacy Act (CCPA), which goes into effect Jan. 1; the “Mind Your Own Business Act,” proposed in October, which threatens companies that violate privacy policies with monetary fines and executive jail time; and an October 2018 California law that bans companies from selling internet-connected devices with weak or default passwords, such as “Password” or “1234567.”

“In GDPR we have a number of principles around transparency and accountability – they apply not just in the EU – but we’re seeing similar requirements implemented in other countries,” said Paul Lanois, director of FieldFisher. “National data protection authorities are expected to coordinate enforcement powers … which will likely lead to a more pronounced enforcement impact.”

However, the “fear of the unknown” reigns when it comes to future privacy implications of applications like facial recognition, drones and other emerging technologies.

Moving forward, manufacturers, law enforcement and governments need to consider an array of questions, including what personal data devices collect and process, what the intent of the collection is, how data is used and where it is sent – as well as how data is stored and how long it will be kept, privacy experts say.
