Privacy As Enabling Technology
Recently, Google demonstrated a new smart glasses technology. In the demo, they showed how these smart glasses could “break down communication barriers” by instantaneously translating communications and displaying what the other person is saying in the wearer’s native language. This would allow Mandarin speakers to, for example, communicate effortlessly with English speakers and would allow hearing-impaired persons to see a transcription of what others are saying to them. Pretty freakin’ bueno. But …
For the Google smart glasses to work, they have to either process the language natively or, more likely, upload the voice file to the cloud for processing. From a privacy standpoint, this means a record is being kept of everything that's said. If you think rummaging through emails for electronic discovery is tough, imagine if every employee at your company had everything they said transcribed and stored! From a legal perspective, these stored files would have to be preserved, evaluated for relevance and produced in litigation. One more database to maintain and protect. Personal information in these conversations (names, addresses, health information, SSNs) would also have to be protected.
The technology also implicitly records the communications (conversations) of third parties who likely never gave express or implied consent to the recording of their conversation. Recently, Amazon was the subject of a class-action lawsuit alleging that its Echo (Alexa) devices, with their "always listening" technology, were unlawfully "intercepting" and listening in on conversations (and these conversations have been subpoenaed by police in various criminal investigations).
While the court ruled that the owners of the devices had consented to the recordings, the other people in the room had not. Moreover, with regard to the smart glasses, the voice database uploaded to Google can be parsed and used by Google for marketing, sales, profiling or other purposes, either because in order to use the glasses you must consent to the database analysis or, well, just because. Like most "free" things online, if it's useful and you're not paying for it, you're the product. This means the data streams can be hacked, the stored data can be hacked and, even if the data is anonymized, it can potentially be de-anonymized.
I love the idea of smart glasses. I want them and I want to use them. Even with the privacy implications (like always-on cameras, instantaneous translation of street signs or books, facial recognition and database linkage), I think it’s pretty cool to move the data I need from my hand to my eye.
I love the idea of smart glasses. But the privacy and security implications terrify me.
And this is how privacy and security can be enabling technologies. If we embed respect for privacy and hard-core security into these kinds of products, with transparency and enforceability, we can encourage adoption. Without that privacy and security, we are asking people to put surveillance devices on their faces and just trust us. With privacy and security built in, I'd be the first in line to buy the product. Without it, count me out. Capisce? (Translated from Italian).