No trust, no data – how digital development depends on ethics


The following article is authored by Frederik Weiergang Larsen and Ninni Gustavsen.

 


 

Amid rapid technological and digital development, data has become an essential and ever-growing resource that offers great opportunities for our societies. A central question, however, is how this data-driven development can serve the common good by maximizing benefits and minimizing risks. There is therefore a need to raise awareness of data ethics and to create incentives for developers, businesses, and public institutions to ensure a responsible and ethical use of data.

 

The COVID-19 pandemic has further illustrated the importance of this task. During pandemic lockdowns, workplaces, educational institutions and communication in general moved online. Almost overnight, contact took place via online communication platforms, placing considerable responsibility on, for example, the selection of products made available to a company’s employees and on how these products and services are used. At the same time, the gathering and use of high-quality health data has been key to obtaining the information needed to contain and act against the spread of the virus in society.

 

However, to make successful use of the digital tools at hand, trust must be ensured, and the choice of technology, as well as the use of data, is of the utmost importance. During the pandemic, the Danish Data Ethics Council made recommendations for the use of contact-tracing apps, as well as a digital vaccination passport. The purpose was to ensure that data ethics is embedded in the digital services in order to secure the trust that encourages widespread use and uptake. For the digital COVID-19 passport, the council set out five leading principles: 1) a clearly defined and limited scope, 2) transparency and voluntary use, 3) data security and privacy, 4) a time limit on the storage of data, and 5) clear political accountability for the use of the digital COVID-19 passport.

 

The final app design incorporates many of these data ethics aspects. The app displays no personal data: it shows only a QR code if the certificate is valid, or a message flagging that it is not. Furthermore, the app is limited in scope, with a planned sunset clause in the autumn of 2021. The app is thus meant to be a helpful, voluntary tool for citizens, and it proved necessary for the controlled re-opening of society in the aftermath of the national lockdown in the winter and spring of 2020-21.
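
As a purely illustrative sketch of this data-minimisation principle (not the actual Coronapas implementation; the record fields and the two-week validity rule below are hypothetical assumptions), the idea is that a verifier is only ever shown a validity result, never the underlying personal or health data:

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class VaccinationRecord:
    # Hypothetical record held only on the citizen's own device.
    holder_name: str
    vaccination_date: date


def verifier_view(record: VaccinationRecord, today: date) -> dict:
    """Return only what a verifier needs to see: a validity flag.

    The two-week rule is an illustrative assumption, not the official criterion.
    """
    is_valid = (today - record.vaccination_date).days >= 14
    return {"valid": is_valid}  # no name, no health details


# Example: the verifier sees {'valid': True} and nothing else.
record = VaccinationRecord("Jane Doe", date(2021, 3, 1))
print(verifier_view(record, today=date(2021, 5, 1)))
```

Separating what is stored from what is displayed in this way reflects the council's principles on data security and privacy.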

 

To fulfil the potential of data and digital solutions, people must feel safe using them. Data ethics is therefore defined as going beyond existing legislation on privacy and data protection, such as the European General Data Protection Regulation, and addressing the responsible processing and use of data in technologies such as artificial intelligence.

 

Challenges

 

When we as consumers navigate online, we leave digital footprints, which companies then use in their businesses. It is important, especially for small and medium-sized enterprises (SMEs), to raise awareness of how to use data and digital technology in a safe and ethical way, in order to create trust in new digital business models and secure their competitiveness in the long run. Trust is the key component in ensuring a wide uptake of digital technology and solutions. It is therefore important to create a common framework in which the trustworthy use of data goes hand in hand with technological innovation and economic competitiveness. Transparency on data-based decision-making will empower consumers and businesses to choose products and solutions that are based on an ethical and responsible use of data. It is thus important to create market incentives for companies to become more responsible, making data ethics a competitive advantage.

 

Legislation to ensure ethical use of data

 

The Danish government has adopted what is presumably the world’s first law on the disclosure of data ethics policies. The law affects Denmark’s biggest companies and how they conduct their annual reporting. It requires companies with a data ethics policy to provide information on compliance, while companies without one must explain why they do not have a policy, much as they do today for CSR.

 

The overall aim is to encourage the largest Danish companies to reflect and act upon their general use of data and develop policies on data ethics.

 

The law entered into force on 1 January 2021, and the Danish Government has developed a guide for businesses on how to include data ethics in their annual reports, as there are no specific requirements on what must be included in the annual reporting on data ethics.

 

The guide provides examples, including a description of the following considerations:

  1. What types of data are used and how are they provided?
  2. Which technologies, such as artificial intelligence, are used, how, and for what purpose?
  3. What are the data ethical considerations of personalization and segmentation of products and services? And on which parameters does the personalization take place?
  4. How and on what managerial level are decisions about the use of data and digital technology anchored, and how is continuous training of employees ensured?

 

TDC, Denmark’s largest telecommunications company, has been a frontrunner and has included data ethics in its annual reports since 2018.

 

Related initiatives

 

In 2019, the Danish Government appointed a new council on data ethics. The Data Ethics Council contributes to the debate regarding digital solutions, data, and AI. To realize the many advantages offered by the use of data, the council seeks to support development in an ethical manner that takes into account citizens’ fundamental rights, legal certainty, and the fundamental values of society.

 

In addition, the Danish government, together with a consortium consisting of the Confederation of Danish Industry, the Danish Chamber of Commerce, SMEdenmark, and the Danish Consumer Council, has created a Joint Cybersecurity and Data Ethics Seal. The seal will be an independent labelling scheme awarded to companies that meet its requirements for cybersecurity and the responsible handling of data. It will tell consumers which companies handle data and AI in a trustworthy, ethical, and secure way. As a seal of approval, it will hopefully create a market incentive for actors to handle data more ethically. The seal will be launched in September 2021.

 

Furthermore, the Danish Business Agency has created a data ethics toolbox to help companies, especially SMEs, implement data ethics policies and practices. The toolbox contains guidelines on how to include data ethics in a company’s code of practice, best-practice case studies, and impact-assessment tools for the responsible use of algorithms.

 

Denmark hopes that these initiatives will boost the ethical use of data and create transparency and sustained awareness about data ethics in business, both in Denmark and globally.

 


 

….

 

Frederik Weiergang Larsen is a government official at the Danish Business Authority, Ministry of Industry, Business, and Financial Affairs, Denmark. He works mainly on the global and European digital economy, on issues regarding data ethics, trustworthy AI, and the uptake of digital technology by businesses.

 

Ninni Gustavsen is also a government official at the Danish Business Authority, Ministry of Industry, Business, and Financial Affairs, Denmark. She works on privacy and the ethical use of data, with a focus on privacy from a business perspective as well as data ethics.

 

 

The authors are responsible for the facts contained in the article and the opinions expressed therein, which are not necessarily those of UNESCO and do not commit the Organization.

 
