Samuel R Beighton
Partner
Co-lead of the Family Matters network
Digital technology is evolving quickly, and innovative businesses often find themselves using data in ways that lawmakers have yet to grapple with. These new and emerging uses of data may expose customers, employees or other groups to risks for which the law offers firms no guidance – potentially putting the business in danger of public outcry.
Artificial intelligence (AI) is the most conspicuous example of this. Privacy and civil rights campaigners have long raised the alarm about the social risks of AI – in particular, the entrenchment of prejudice – but policymakers are still determining how best to regulate such dangers.
Business-to-business (B2B) data sharing has the potential to amplify and complicate these risks. When algorithms are trained using pooled data, for example, it may not be possible to identify and isolate embedded bias.
There are other ethical and reputational risks in B2B data sharing that aren't yet governed by law. Competition authorities around the world have identified data sharing as a potential threat to competition and consumer rights, but the available guidance is not definitive and cannot identify every possible concern.
Given these gaps in law and regulation, organisations are increasingly adopting data ethics frameworks to avoid social harms and other negative consequences. This can not only reduce such harms, but also demonstrate to regulators and the public that efforts have been made to do so.
Data ethics initiatives have often been launched in relation to AI, but they can also help to limit the potential harms of B2B data sharing projects. At the same time, businesses must ensure that their ethics frameworks are integrated with more concrete legal obligations.
Some risks associated with B2B data sharing – such as those relating to data protection, cyber security, fraud and discrimination – are already covered by laws and regulations. But when it comes to emerging uses of technology, practices that may be legal can still carry a reputational risk, observes Matt Hervey, partner and head of AI at Gowling WLG. "Something may be strictly legal and comply with regulation and even best practice, but it might still upset individuals and the general public," he explains.
Hervey cites the well-known example of US retailer Target, which promoted pregnancy-related products to a teenager based on her purchasing patterns – angering her father, who was unaware that she was pregnant. In this case, the prediction was accurate and legal, but it violated customer expectations around data privacy.
Risks that could emerge from B2B data sharing, especially when combined with AI, include the danger that a third party is able to 'de-anonymise' anonymised data by combining it with other datasets, or that shared data is used to develop AI systems that have prejudicial outcomes.
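The de-anonymisation risk can be made concrete with a minimal sketch. All datasets and field names below are hypothetical: the point is simply that records stripped of names can often be re-identified by joining them with another dataset on shared quasi-identifiers, such as a postcode and birth year.

```python
# Hypothetical illustration: an "anonymised" dataset is re-identified by
# linking it to a separate public dataset on shared quasi-identifiers.

# Shared health records: names removed, but quasi-identifiers retained.
anonymised = [
    {"postcode": "SW1A 1AA", "birth_year": 1985, "condition": "asthma"},
    {"postcode": "M1 2AB", "birth_year": 1990, "condition": "diabetes"},
]

# A separate, publicly available dataset (e.g. an electoral register).
public_register = [
    {"name": "Alice Smith", "postcode": "SW1A 1AA", "birth_year": 1985},
    {"name": "Bob Jones", "postcode": "M1 2AB", "birth_year": 1990},
]

def deanonymise(anon_rows, public_rows):
    """Link records on the quasi-identifiers the two datasets share."""
    matches = []
    for a in anon_rows:
        for p in public_rows:
            if (a["postcode"], a["birth_year"]) == (p["postcode"], p["birth_year"]):
                matches.append({"name": p["name"], "condition": a["condition"]})
    return matches

# Each unique (postcode, birth_year) pair re-identifies one individual,
# attaching a name to a supposedly anonymous medical condition.
print(deanonymise(anonymised, public_register))
```

In practice the linking datasets may be held by a third party the original discloser never anticipated, which is why anonymisation alone is rarely a complete safeguard in B2B sharing arrangements.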
An ethics-based approach allows businesses to limit reputational risks such as these, Hervey says. "Companies that act ethically avoid reputational risk, because they are deliberately taking steps to avoid social harm," he explains.
Such an ethics-led approach to data governance is also likely to chime with upcoming regulation, he adds. "Regulation is being developed on the use of data and AI and some of it has an ethical dimension around discrimination, bias and explainability."
As a result, a growing number of businesses are implementing – and, in many cases, publishing – ethical frameworks for their use of data. "Ethical frameworks are good PR, help businesses engage with relevant stakeholders and may even influence future regulation," observes Hervey.
Some companies are hiring data ethics officers to design and implement these frameworks and convening ethics committees to oversee them. German life sciences giant Merck, for example, has created a Digital Ethics Advisory Panel, consisting of technology, ethics and legal experts, to help implement its Code of Digital Ethics.
One of the toughest questions for any data ethics initiative is deciding what is ethical. In the context of AI, consensus is emerging, says Hervey. "On AI specifically, there's almost universal international agreement that ethical considerations should include bias, robustness, transparency and explainability, and the appropriateness of delegating significant decisions to machines," he explains.
With B2B data sharing, however, ethical conduct is less well-defined. "Deciding whether to use a dataset in relation to a product or service is quite difficult," Hervey says. "I don't think anyone foresaw the unintended consequences of aspects of social media, for example."
Any data sharing scheme should also consider how it will demonstrate that its use of data is ethical. Important factors include the terms on which data is shared, the purposes for which its use is allowed and monitoring systems to log who has accessed it. "Companies need to look at all the implications of data sharing – including cyber security, consents and the impact on data subjects."
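The monitoring point above can be sketched in code. Everything here is a hypothetical illustration, not a reference implementation: an append-only access log that records who accessed a shared dataset, when, and for what declared purpose, refusing purposes outside those agreed in the sharing terms.

```python
from datetime import datetime, timezone

# Hypothetical sketch: purposes agreed in the data-sharing terms.
PERMITTED_PURPOSES = {"fraud-detection", "credit-risk-modelling"}

access_log = []  # append-only audit trail of all access requests

def record_access(party, dataset, purpose):
    """Log an access request and refuse purposes outside the agreed terms."""
    allowed = purpose in PERMITTED_PURPOSES
    access_log.append({
        "party": party,
        "dataset": dataset,
        "purpose": purpose,
        "allowed": allowed,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return allowed

# A permitted use is logged and allowed; an off-terms use is logged and refused.
print(record_access("PartnerCo", "transactions-2023", "fraud-detection"))
print(record_access("PartnerCo", "transactions-2023", "marketing"))
```

Even refused requests are logged, so the scheme can later demonstrate both who accessed the data and that off-terms uses were blocked.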
A data ethics framework should not be static, Hervey adds – technology is always evolving, making its potential impact hard to predict. He recommends monitoring developments and considering ethical issues as they emerge.
Another risk that companies face when sharing B2B data is provoking competition concerns, says Samuel Beighton, a partner in Gowling WLG's EU, Trade and Competition practice.
"A key risk when businesses share data through bilateral arrangements, or through trade associations and consortia, is when [they] inadvertently share competitively sensitive information that raises concerns under the competition rules," he explains, adding that, post-Brexit, the UK is currently updating the applicable guidance on information exchanges between competitors.
Beighton highlights concerns that B2B data exchange could also exclude new entrants to a given market. "If some industry players are engaged in an ongoing data exchange via a consortium or a private arrangement, how can other players get access to this data if it's needed to access the marketplace?"
Businesses sharing salary data have recently caught the attention of the UK's competition watchdog. "We recently had guidance from the Competition and Markets Authority addressing employers sharing information about salary bands and what they are prepared to pay for different roles," says Beighton. "If employers establish an understanding about a benchmark position in a sector with their competitors, they keep their costs down by reducing competition, but this arrangement is likely to break the law."
Therefore, when participating in B2B data-sharing schemes, businesses must check that they are not opening themselves up to scrutiny, as regulators and competition lawyers increasingly recognise the influence of data on fair competition.
Applying an ethics-led approach to B2B data sharing in the context of competition can signal to regulators that a company is seeking to act as a "good corporate citizen", says Beighton. "Ethics have a broader spectrum than the legal test," he observes. "Indeed, an ethics-based approach [to data sharing] can be a real point of difference if the standard of that approach exceeds legal requirements."
However, viewing competition concerns through an ethics lens can sometimes blind businesses to their legal obligations, Beighton warns. "There is a risk that when companies think they are taking an ethics-based approach, they fall short of the relevant legal requirements," he says. "To rely on ethics alone raises the risk of a business convincing itself that a decision is ethical when it actually hasn't complied with the law."
He cites an investigation into cartel activity in the UK, in which a group of individuals at competing companies convinced themselves they were doing the right thing by agreeing to fix prices in order to protect their businesses and save employees' jobs.
"It was justified as a 'good thing', as it meant people didn't lose their livelihoods, but the cartel infringed competition law," he explains. "Companies were fined and people faced criminal prosecution."
Beighton is concerned that if companies rely on their own assessment of ethics without putting a legal framework around data sharing decisions, they may open up areas of risk they had not previously considered.
"When companies talk about ESG, sustainability and environmental impacts, that instinctively feels like they are charting an ethical path as a business, but there is a danger that they inadvertently allow this ethical narrative to justify certain behaviours," says Beighton.
"From a legal perspective, there is a risk that enthusiasm for that ethical journey might lead to conversations [e.g. with competitors] that wouldn't normally take place," he says. "Someone would say, 'Why are we talking about that with a competitor? We should check that before we discuss it', whereas if you're caught up in an ethical narrative you can overreach [and share too much]."
For these reasons, it makes sense for businesses developing an ethical framework for B2B data sharing to build in legal requirements from the outset. Hervey observes that for many businesses, data privacy is the natural starting point, as the context and skills are similar and data privacy and compliance officers are already trained to carry out impact assessments.
Beighton advises that organisations planning ethical frameworks for data sharing should keep one eye on the developing legal and regulatory environment. "It's important to look at changing legal requirements and take legal advice, because however wonderful it sounds, a desire to be ethical is not justification for competition law breaches."
To avoid increasing business risk, ethics-based data sharing "needs to go hand in hand with legal considerations, rather than letting an internally decided narrative on ethics steer the ship," says Beighton.
Want to learn more about B2B data sharing? Read the other articles in our Data Unlocked series.
If you have any questions relating to this article, please contact Matt Hervey or Samuel Beighton.
You can also sign up to our mailing list to receive future data-related articles.