Data Ethics Practices in an Era of Digital Technology and Regulation: Panel Discussion
Thanks to advances in technology, the collection and analysis of personal data have become pervasive within businesses in recent years, allowing companies to better engage with customers as well as to drive growth and innovation.
But the ubiquitous use of data and technology to make determinations or predictions about individuals, from what movies we stream to whether we get a bank loan, is also prompting society to consider the ethical implications. A recent survey showed that 85% of consumers say it is important for organisations to factor ethics into their use of AI, while the percentage of executives ranking AI ethics as important jumped from 50% in 2018 to 75% in 2021.
This means companies must balance commercial decisions with subjective ethical considerations when handling increasingly large amounts of data. Failing to uphold ethical standards can have significant impacts on their reputation, customer loyalty and, ultimately, revenues.
Robert Grosvenor, Managing Director and global lead for Privacy and Data Compliance Services with Alvarez & Marsal’s Disputes and Investigations practice in London, recently joined a panel of industry-leading experts at The Lawyer’s GC Strategy Summit to discuss how businesses are stepping up to this challenge. The main takeaways of the session are summarised below.
How should data ethics be defined?
While data ethics has its roots in privacy and data protection law, it also addresses broader societal issues and therefore draws on principles from human rights, philosophy and corporate social responsibility. The Open Data Institute defines data ethics as “a branch of ethics that evaluates data practices with the potential to adversely impact on people and society – in data collection, sharing and use.”
One growing area of attention is the potential for bias in algorithmic decision-making: systematic and repeatable errors in algorithmic systems that create unfair outcomes by privileging one group of users over others, amplifying existing bias or even creating new forms of it. Regulators and civil society are increasingly calling for greater algorithmic accountability and transparency. This includes a pioneering AI regulatory framework being drafted by the European Union that is likely to come into force in the first half of 2023.
Trust is another important pillar of the data ethics discussion. Expectations around business conduct have never been higher, making it imperative for companies to clearly articulate where they draw the ethical lines with respect to data use. Proper data ethics also helps build and solidify trust with customers, which translates into continued interactions with new products and technologies. Without this engagement, businesses may lose access to the very data sources that are intended to boost growth and profits.
Beyond purely financial indicators, understanding and addressing ethical considerations in relation to data is critical from an Environmental, Social and Governance (ESG) perspective too.
As part of the governance aspect of ESG, organisations have to maintain appropriate data governance controls, including compliance with privacy and data protection regulations around the world. Increasingly, though, the ethical implications of data collection and analysis fall within the social dimension of ESG, given the impact of these activities on individuals’ essential rights such as human dignity and autonomy.
What is the best corporate approach to data ethics and who is responsible?
Panellists agreed that taking a broad view of the impact of data usage – moving beyond compliance and embedding data ethics into their corporate data management practices – is a key first step.
Companies that set privacy and data compliance goals linked to the potential harms and ethical implications of their data use are likely to be in a better position to explore the benefits of big data, maintain consumer trust and navigate regulatory, political and cultural pressures.
That said, it is still unclear to many businesses where responsibility for data ethics should sit within an organisation. A separate survey revealed that half of directors point to the CTO and CIO as primarily accountable for AI ethics, while barely half view it as a CEO-level responsibility.
While large global organisations, such as IBM or Vodafone, have been pioneers in embedding ethics into their existing business guidelines, companies at different levels of maturity around addressing digital ethics can adopt measures to start developing their stance in this area. These include:
• Consider creating a committee of cross-functional internal and external experts to help address complex ethical issues in the context of the organisation’s own digital transformation strategy and specific data-driven initiatives
• Incorporate privacy and data ethics into ESG and link them to the wider company mission to retain visibility and secure the support of senior leadership
• Revisit and align the organisation’s values and standards to identify, understand and address how customer trust and ethical data management should be used to support current strategic and operational changes in business plans and technology innovation
• Build data ethics and privacy-by-design principles into the organisation as part of a long-term roadmap for digital success, including appropriate awareness of, and engagement with, senior management, business delivery partners, clients, industry bodies and policy makers
• Continuously monitor for adverse results, including bias or discriminatory outcomes, through ongoing review of customer complaints and feedback and periodic health checks and audits of AI- and analytics-based decision-making, as illustrated below
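To make the last point concrete, the sketch below shows one simple form such a periodic health check might take: comparing approval rates for an automated decision (for example, a loan decision) across groups and flagging large disparities. This is a minimal illustration only; the record structure, the group labels and the 0.8 “four-fifths” threshold are assumptions for the example, and a real audit would involve far broader legal and statistical analysis.

```python
from collections import defaultdict

# Illustrative data only: (group, decision) pairs, where the group is a
# protected attribute captured for audit purposes and decision 1 = approved.
records = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_b", 1), ("group_b", 0), ("group_b", 0), ("group_b", 0),
]

def approval_rates(records):
    """Compute the approval rate for each group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, decision in records:
        totals[group] += 1
        approved[group] += decision
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_flags(rates, threshold=0.8):
    """Flag groups whose approval rate falls below `threshold` times the
    highest group's rate (the commonly cited 'four-fifths' rule of thumb)."""
    best = max(rates.values())
    return {g: (r / best) < threshold for g, r in rates.items()}

rates = approval_rates(records)
flags = disparate_impact_flags(rates)
for group, rate in rates.items():
    status = "REVIEW" if flags[group] else "ok"
    print(f"{group}: approval rate {rate:.2f} [{status}]")
```

In practice, a check like this would feed into the complaint-review and audit processes described above rather than serve as a standalone test.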
A&M: Leadership. Action. Results.
A&M’s privacy and data compliance practice supports clients in navigating the evolving and complex data protection regulatory landscape and in developing and implementing solutions to address these challenges.
We offer specialist advisory and consulting services on international and cross-border privacy, data protection, secrecy and related laws and sectoral rules. Professionals within the practice include former consultants, regulators, data protection officers and certified information privacy professionals who are skilled at aligning and implementing complex regulatory requirements within operational processes and settings.