Data privacy and the Internet of Things


The following article is authored by Stacy-Ann Elvy.

The rapid rise of the Internet of Things (IoT) raises pressing privacy concerns despite its possible benefits. IoT devices can collect vast quantities of granular data about individuals’ daily habits and activities, including consumption rate data, location data, and health-related data, among other things. IoT objects, such as smart rings and thermometers, have also been used in COVID-19 pandemic relief efforts (Eadicicco, 2020; Gardner, 2020). Companies’ traditional data business models have historically enabled “surveillance capitalism”, the goal of which is to “predict and modify human behavior as a means to produce revenue and market control” (Zuboff, 2015, p. 75; 2019). If left unchecked, the increasing volume and variety of IoT data, in conjunction with traditional and nascent business models, may lead to an expansion of surveillance capitalism with even more far-reaching consequences.

 

IoT devices can be used by various entities to colonize and obtain access to people’s homes and bodies while potentially decreasing their anonymity. This possible corporate colonization and surveillance may limit individuals’ ability to determine what happens to their information and may decrease their ability to shield themselves, their emotions, and their daily activities from various actors. The IoT may also worsen pre-existing levels of unequal access to privacy and security. Data generated from individuals’ use of IoT devices and associated mobile applications, combined with other sources of information, could also be used to enable discrimination and negatively impact the opportunities that individuals receive. Historically marginalized groups may be more susceptible to data discrimination despite existing laws.

 

Individuals must often accept companies’ privacy policies in connection with purchasing and using IoT devices and services. These documents can authorize companies to use personally identifiable data for their own purposes and to transfer and disclose certain data to third parties. Even when such data are anonymized and aggregated, privacy and security-related risks may still arise. Some studies suggest that individuals can be identified in anonymized datasets and that inferences about individuals may be gleaned from aggregated datasets (Felten, 2012; Ohm, 2010, pp. 1703-05). In 2021, seemingly anonymized and aggregated cell phone location data purchased by a publication from a data vendor were used to publicly reveal a Catholic priest’s visits to gay bars and his use of a gay dating mobile application (Boorstein et al., 2021; De Chant, 2021). Personal data may also be disclosed and transferred to other entities in mergers and acquisitions and corporate bankruptcy proceedings. Increasingly, large technology companies are acquiring smaller technology businesses. This consolidation of power and growing market dominance allows large technology companies to obtain copious quantities of data about individuals across multiple platforms and devices, giving technology giants deeper insights into individuals’ habits and preferences. Data consolidation through cross-device and platform tracking may also increase data security risks.
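To make the re-identification point concrete, here is a minimal sketch of a “linkage attack” of the kind such studies describe: an “anonymized” dataset with names removed is joined to a public record on shared quasi-identifiers, re-attaching identities to sensitive attributes. All records, names, and field choices below are fabricated for illustration.

```python
# Minimal sketch of a linkage attack: joining an "anonymized" dataset
# (direct identifiers removed) with a public record on shared
# quasi-identifiers. All data are fabricated for illustration.

# "Anonymized" health records: names stripped, but quasi-identifiers
# (ZIP code, birth date, sex) retained.
anonymized_records = [
    {"zip": "95616", "birth_date": "1980-04-12", "sex": "M", "diagnosis": "asthma"},
    {"zip": "95616", "birth_date": "1975-09-30", "sex": "F", "diagnosis": "diabetes"},
]

# A public record (e.g., a voter roll) listing names alongside
# the same quasi-identifiers.
public_records = [
    {"name": "J. Doe", "zip": "95616", "birth_date": "1980-04-12", "sex": "M"},
    {"name": "A. Roe", "zip": "95616", "birth_date": "1975-09-30", "sex": "F"},
]

QUASI_IDENTIFIERS = ("zip", "birth_date", "sex")

def reidentify(anon_rows, public_rows):
    """Link anonymized rows to public rows sharing every quasi-identifier."""
    matches = []
    for anon in anon_rows:
        key = tuple(anon[q] for q in QUASI_IDENTIFIERS)
        for pub in public_rows:
            if tuple(pub[q] for q in QUASI_IDENTIFIERS) == key:
                matches.append((pub["name"], anon["diagnosis"]))
    return matches

# Each quasi-identifier combination unique to one person re-attaches
# a name to a sensitive attribute, despite the removal of names.
for name, diagnosis in reidentify(anonymized_records, public_records):
    print(f"{name} -> {diagnosis}")
```

The more granular the attributes a dataset carries, as with IoT location traces, the more likely each combination is unique to a single person, which is why anonymization and aggregation alone may not eliminate the risks described above.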

 

Unlike other jurisdictions, the United States has historically adopted a sectoral approach to privacy protection in which distinct laws and regulatory bodies govern separate industries. The lack of comprehensive federal privacy legislation can lead to regulatory gaps. There is a pressing need to adopt comprehensive federal privacy and security legislation that provides baseline privacy rights to natural persons and that goes beyond the notice-and-choice model by providing guidance on permissible and impermissible data practices and additional protections for sensitive data, such as biometric identifiers.

 

Some states, by contrast, have been more active in the data privacy arena. The California Consumer Privacy Act grants several privacy rights to California residents and restricts companies’ ability to force individuals to waive statutorily granted rights by expressly providing that any such contractual provisions are invalid [1]. New technological developments, such as global privacy controls, may also allow individuals to more easily exercise their state-granted privacy rights (Edelman, 2020). State privacy law efforts to provide more stringent privacy protections must also be drafted with existing federal statutory and constitutional limitations in mind. For example, the First Amendment may impact the validity of certain data transfer restrictions and of an overly broad right to delete. Existing federal legislation may also pre-empt state law. For instance, the federal Children’s Online Privacy Protection Act pre-empts “inconsistent state law” and likely restricts states’ ability to adopt more privacy-protective measures for covered children (Solove and Schwartz, 2018) [2]. It is unclear whether state privacy legislative efforts will be able to sufficiently address IoT privacy and security risks. Privacy and security by design and default is an integral concept in the IoT setting. Where possible, devices could be designed to function without always having to collect and transfer data or connect to the Internet.
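As a concrete illustration of how global privacy controls work, the sketch below shows how a website might detect the Global Privacy Control (GPC) signal, which participating browsers send as a “Sec-GPC: 1” request header, and treat it as an opt-out. The handler name and response text are illustrative assumptions, not any particular business’s implementation.

```python
# Minimal sketch of a server honoring the Global Privacy Control (GPC)
# signal, which participating browsers send as a "Sec-GPC: 1" header.
# Handler and responses are placeholders for illustration only.
from http.server import BaseHTTPRequestHandler, HTTPServer

class GPCAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Header lookup is case-insensitive; "1" indicates the user
        # has turned the opt-out preference signal on.
        opted_out = self.headers.get("Sec-GPC") == "1"
        body = (
            b"GPC signal received: treating request as an opt-out.\n"
            if opted_out
            else b"No GPC signal: default data practices apply.\n"
        )
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Try it locally: curl -H "Sec-GPC: 1" http://localhost:8000/
    HTTPServer(("localhost", 8000), GPCAwareHandler).serve_forever()
```

California’s regulators have indicated that user-enabled signals of this kind can qualify as valid requests to opt out of the sale of personal information under the California Consumer Privacy Act.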

 


 

References:

 

Boorstein, M., Iati, M. & Shin, A., 2021. Top U.S. Catholic Church official resigns after cellphone data used to track him on Grindr and to gay bars. The Washington Post. Available at: https://www.washingtonpost.com/religion/2021/07/20/bishop-misconduct-resign-burrill/.

 

De Chant, T., 2021. Catholic priest quits after "anonymized" data revealed alleged use of Grindr. Ars Technica. Available at: https://arstechnica.com/tech-policy/2021/07/catholic-priest-quits-after-anonymized-data-revealed-alleged-use-of-grindr/?amp=1.

 

Eadicicco, L., 2020. Coronavirus: Smart rings monitor body temperature to detect symptoms early. Business Insider. Available at: https://www.businessinsider.com/coronavirus-smart-ring-san-francisco-hospitals-covid19-symptoms-study-oura-2020-3?amp.

 

Felten, E., 2012. Is aggregate data always private? Federal Trade Commission. Available at: https://www.ftc.gov/news-events/blogs/techftc/2012/05/aggregate-data-always-private.

 

Gardner, S., 2020. Coronavirus: St. Augustine to use smart thermometers against COVID-19. The St. Augustine Record. Available at: https://www.staugustine.com/news/20200331/coronavirus-st-augustine-to-use-smart--thermometers-against-covid-19.

 

Notes:


[1] Cal. Civ. Code §§ 1798.120, 1798.125, 1798.135, 1798.192.
[2] 15 U.S.C.S. § 6502(d).
 


 

Stacy-Ann Elvy is a Professor of Law and Martin Luther King, Jr. Hall Research Scholar at the University of California, Davis School of Law. Her research focuses on “the commercial law of privacy” and its relationship to emerging technology and human rights law.

 

The facts, ideas and opinions expressed in this piece are those of the authors; they are not necessarily those of UNESCO or any of its partners and stakeholders and do not commit nor imply any responsibility thereof. The designations employed and the presentation of material throughout this piece do not imply the expression of any opinion whatsoever on the part of UNESCO concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries.
