Autonomous Devices Are Changing the Meaning of Privacy
Autonomous agents (or “bots”) embedded in smart home devices, appliances, conversational assistants, mobile devices, drones, cameras, and other sensor-laden IoT hardware increasingly populate the physical spaces in which humans live and work. These devices observe and interpret everything we say and do, recording video and audio and uploading large volumes of data to cloud networks. As we pass through their sensor range, they capture not only basic demographic attributes such as age, gender, and race, but also location data, behavioral patterns, and even biometric data such as facial and vocal characteristics, heartbeat, and other telemetry.
As much information as we knowingly or unknowingly share with them, they glean even more through algorithmic inference and prediction. Vast amounts of this data are assimilated into elaborate consumer profiles that are purchased by third parties in order to target information and market products to us. The impact on privacy has not gone unnoticed, but, unfortunately, we tend to view all privacy issues as information privacy issues and then try to solve them by applying “notice and consent” modalities of thinking that are primarily concerned with obtaining broad permissions from consumers to sell their personal information or behavioral data to third parties for marketing purposes.
Participants in this system have allowed this notion of information privacy and its associated notice and consent modality to define most aspects of the data privacy conversation, from its regulatory motifs to the design of the privacy setting user interfaces for giving or denying consent. Compounding this issue is the fact that, in privacy jurisprudence, people tend to be protected against privacy violations only when the intrusion is unreasonable or unexpected. The interplay of the notice and consent modality with the amorphousness of the “reasonableness” doctrine means that, over time, our “reasonable expectation” of privacy erodes as individuals give blanket permission for data gatherers to freely capture and use personal and behavioral information.
“Privacy of Personhood” – A Quaint Notion?
Robotic devices of increasing sophistication are also becoming more prevalent in our homes, workplaces, and public areas. Robots introduce qualitatively different privacy concerns. Because robots live and work with us in physical space, their actions affect us much as the actions of other humans do. In this sense, many of the privacy problems emerging from widespread robotic device use are closer to the traditional notions we hold about privacy.
At Privaceum, we believe our platform takes a dramatic step toward redefining privacy for an increasingly un-private world.
Society encodes these traditional notions in legal concepts such as intrusion upon seclusion, wiretapping, and battery, and in common cultural norms such as the amount of personal space one expects in public places and politeness in conversation. These represent the kind of “privacy of personhood” that makes a person comfortable sharing public and private spaces with robots. Asking robots to respect our privacy amounts to setting constraints both on sensor activation and telemetry and on their activities and movements.
Questions to be addressed include: In what circumstances can a robot’s sensors record speech or video? How closely should a robot pass by a person in the home, in a public place, or other contexts? What manner of touch is appropriate for different people and in different contexts? What places are off-limits to robots when certain people are in them? How does a robot know what to do when an unknown person comes into its space for the very first time?
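One way to make these questions concrete is to imagine each space carrying an explicit policy that a device consults before acting. The sketch below is purely illustrative: the class, field names, contexts, and values are all hypothetical, invented for this example, and do not describe any real platform's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class SpacePolicy:
    """A hypothetical per-space privacy policy a robot might consult."""
    context: str                       # e.g. "bedroom", "public_park"
    allowed_sensors: set = field(default_factory=set)
    min_approach_m: float = 1.0        # closest a robot may pass by a person
    off_limits: bool = False           # space closed to robots entirely

def may_record(policy: SpacePolicy, sensor: str) -> bool:
    """Check a policy before activating a camera or microphone."""
    return not policy.off_limits and sensor in policy.allowed_sensors

# A bedroom bars video and keeps robots at a distance; a park does not.
bedroom = SpacePolicy("bedroom", allowed_sensors={"motion"}, min_approach_m=2.0)
park = SpacePolicy("public_park", allowed_sensors={"motion", "camera"})

assert not may_record(bedroom, "camera")
assert may_record(park, "camera")
```

Even a toy policy like this illustrates the shift the article argues for: the constraint lives with the space and the person, not in a one-time consent form.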
Moving Beyond “Consent”
Autonomous and robotic device privacy management is fundamentally dynamic and contextual. Unlike in web-based privacy models, people and robots are mobile: robots can move into different physical spaces inhabited by different people, and different people can enter or exit a device’s functional proximity at any time. Privacy expectations are also based on cultural norms, shared group values, and even physical location. Sometimes, a situational context such as an emergency will override all other concerns. Thus, any proposed solution must provide a common, consistent standard that helps robots act in alignment with our contextually informed values.
Privacy management becomes exponentially more difficult in real-world scenarios where devices must select appropriate governance actions to accommodate the potentially conflicting privacy needs of multiple people simultaneously occupying a home, workplace, or public space. Many of these individuals may be encountering a particular device for the first time. How does a device know what to do when multiple people have different expectations about what should happen? A caregiver robot, for example, has to align the needs of its patient, who might have a more limited sense of personal privacy, with those of the other members of the household, whose privacy expectations may be much higher.
Devices will be required to dynamically navigate a matrix of complex privacy settings, customs, culture, and personal needs and, in some cases, the device may need to mediate conflicts between people to take effective action. Traditional notice and consent mechanisms, considered by many to be largely ineffective even within their own purview of web and app privacy, are likely to be completely insufficient when applied to autonomous device privacy management, which requires granular and scenario-specific restrictions on the range of actions a device can take in a wide variety of environments.
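The mediation problem described above can be sketched in a few lines. This is a deliberately simplified, hypothetical resolution rule, not a description of how any shipping system works: when several people share a space, the device honors the most restrictive stated preference, defaults to maximum privacy for unknown people, and lets a declared emergency override everything. The level names and ordering are invented for illustration.

```python
# Hypothetical privacy levels, ordered from least to most restrictive.
LEVELS = {"open": 0, "limited": 1, "strict": 2}

def effective_level(preferences, emergency=False):
    """Return the privacy level a device should apply in a shared space."""
    if emergency:
        return "open"          # e.g. allow all sensors for a medical response
    if not preferences:
        return "strict"        # unknown people default to maximum privacy
    # The most restrictive preference among those present wins.
    return max(preferences, key=lambda p: LEVELS[p])

# A patient with relaxed expectations sharing a home with strict housemates:
assert effective_level(["limited", "strict", "strict"]) == "strict"
# A person the device has never encountered before:
assert effective_level([]) == "strict"
# An emergency overrides all other concerns:
assert effective_level(["strict"], emergency=True) == "open"
```

Real mediation would of course be far richer, weighing context, culture, and relationships rather than a single ordered scale, but even this sketch shows why a one-time consent checkbox cannot express the decision a device must make moment to moment.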
What if Privacy Were Re-centered on the Individual?
Instead of the privacy-eroding notice and consent paradigm, a new paradigm must evolve, one re-centered on the individual, so that people and groups can make fine-grained choices about the range of privacy-impacting behaviors a device may take in a wide variety of environments. We need a mechanism that helps robotic and IoT devices act in alignment with our contextually informed values, in privacy and in other spheres of interchange.