The ‘Big Data’ balancing act


Lauren Solomon is CEO of the Consumer Policy Research Centre.

Big Data can help us develop and target social policies, but must be managed effectively and in accordance with community expectations, writes Lauren Solomon.

Everybody is trying to work out what ‘Big Data’, digital transformation and artificial intelligence really mean for our community.

As I sat down to write this article, I reflected on the broad range of researchers, businesses and advocates we’ve spoken to over the past year as we grapple with this issue. For a consumer organisation, it’s not common to spend so much time with computer scientists, privacy law scholars, coders, human rights advocates and data scientists.

In part, this broad engagement reflects the structural shift now occurring across society and in the operation of markets.

The opportunities and challenges that lie ahead are broad and complex. The recent Consumer Policy Research Centre (CPRC) report Consumer Data & the Digital Economy warned of a growing chasm between current consumer data collection, sharing and use practices and community expectations.

In an era where erosion of trust is a daily headline, such a gap will only lead to growing instability, community frustration and potential knee-jerk policy intervention.

The US tech consultancy firm Gartner defines Big Data as ‘high-volume, high-velocity and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation’.

It’s this last point – information processing that enables enhanced insight and decision making – that fuels so much heated debate.

Data is now a major input to production. The collection, amalgamation and analysis of data to inform business and government decision-making, or the development and targeting of products and services, is growing rapidly.

It’s perhaps partly why ACCC Chair Rod Sims described the collection and management of our data as ‘one of the defining questions of our age’.

But as people, we don’t see it as ‘just data’ if it includes large swathes of personal information: about our friends, networks, preferences, location, shopping habits, music tastes, physical and mental wellbeing, even our genes or mouse movements as we navigate the web.

The collection of this data might seem harmless in isolation. But things get tricky when it’s amalgamated and used to make inferences or decisions about who we are, how we might behave, our health or our profitability as consumers.

The development of profiles or ‘scores’ of individuals based on this information raises fundamental questions about our right to privacy, the scope for discrimination, inequality, information asymmetry between buyers and sellers, and market power.
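To make the mechanics concrete, here is a deliberately simplified sketch in Python. Every signal, weight and threshold is hypothetical – invented for illustration, not drawn from any real scoring model – but it shows how data points that seem harmless in isolation can be amalgamated into a consequential ‘score’.

```python
# Toy illustration only: all signals, weights and thresholds here are
# hypothetical, invented for this sketch rather than taken from any real model.

# Data points collected separately, each innocuous on its own
profile = {
    "postcode": "3012",            # from a delivery address
    "late_night_sessions": 14,     # from app usage logs
    "payday_loan_searches": 3,     # from browsing history
    "gym_checkins_per_month": 1,   # from a linked fitness app
}

# Opaque weights the consumer never sees and cannot contest
weights = {
    "late_night_sessions": -0.4,
    "payday_loan_searches": -2.5,
    "gym_checkins_per_month": 0.8,
}
penalised_postcodes = {"3012", "3019"}  # a crude proxy that can discriminate

score = 100.0
score += sum(weights[key] * profile[key] for key in weights)
if profile["postcode"] in penalised_postcodes:
    score -= 15.0

# The only thing the consumer experiences is the outcome, not the reasoning
outcome = "standard offer" if score >= 85 else "premium pricing or declined"
print(f"score = {score:.1f} -> {outcome}")
```

No single input here would strike anyone as sensitive, yet together they quietly determine what this consumer is offered and at what price.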

It’s a problem that requires experts across many disciplines to work together and share learnings amid the unrelenting drumbeat of technological advancement.

We need a paradigm shift

The most important thing we as a policy community can do is to ensure that technology, data collection and data use reflect community values.

Market research commissioned by the CPRC found Australians value their privacy, but currently don’t feel in control of how data is collected and shared.

European and Californian residents have some protection through the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). These laws deliver greater transparency and choice for consumers, added protections for the processing of children’s data and a right to deletion; the CCPA also requires an easily accessible ‘Do Not Sell My Personal Information’ link on websites.

But Australia has been slow to adopt similar protections.

We need to get the balance right. Policymakers and businesses must put consumers in control of their own data and personal information.

In CPRC’s view, this must be done in four key ways:

Transparency

One of the biggest barriers remains the lack of disclosure of what information is currently being collected, shared and used.

If consumers are expected to make informed choices about their data, they need clear and transparent information about their options and the associated risks.

Information also needs to be comprehensible. Long-winded privacy policies aren’t good enough when we know that about 94 per cent of Australians don’t read all the policies that apply to them. They’re simply too long, vague and complex.

Consent

For consent to be genuine and meaningful it must be provided expressly, specific to purpose, accessible, easy to understand, freely given and able to be withdrawn. (These components are spelled out in the EU’s GDPR.)

Consent also needs to be active, not passive or implied.

Cass Sunstein, one of the grandfathers of behavioural economics, recently observed that a so-called ‘nudge’ (a subtle change to the way decisions are framed and presented) is inadequate when it takes something away from someone without their express consent.

In practice, this calls into question whether businesses or governments using ‘opt-outs’ when collecting, sharing or using data and personal information are really gaining genuine consent.
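As a sketch of what this standard implies in practice, consider the following hypothetical data structure (the ConsentRecord type and its fields are illustrative assumptions, not any regulator’s specification). A pre-ticked box or opt-out default would fail its validity check, because consent was never actively given.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ConsentRecord:
    """Hypothetical record of one consumer's consent for one purpose."""
    purpose: str                   # specific to purpose, not a blanket grant
    given_actively: bool           # express opt-in; False for pre-ticked boxes
    freely_given: bool             # not bundled as the price of basic access
    plain_language_notice: str     # accessible and easy to understand
    withdrawn_at: Optional[datetime] = None  # able to be withdrawn at any time

    def is_valid(self) -> bool:
        # An opt-out default fails here: the consumer never acted
        return (self.given_actively
                and self.freely_given
                and self.withdrawn_at is None)
```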

Choice and control

Of those we surveyed who had waded through at least one privacy policy or terms and conditions document over the past 12 months, more than two-thirds admitted they still signed up for products and services even though they didn’t feel completely comfortable.

Most did so because it was the only way to access the service. When there aren’t options, consumers are faced with a ‘take it or leave it’ situation. What people need are genuine options, clear choice and real control of their own data.

Greater protections for the vulnerable and disadvantaged

As a society, we may decide that certain kinds of data are simply too sensitive to be shared without extreme protections in place. We may also decide that some uses of data are simply inappropriate – especially if they lead to discrimination.

There’s a significant risk people will be excluded from certain products or services because of their data being used to classify them with profiles or ‘scores’.

At a minimum, effective regulation is critical to ensure inappropriate proxies or algorithms are not used to discriminate and exclude vulnerable people.

Big Data can absolutely provide society with benefits and innovations. It can make our lives better.

But we need to be bold enough to protect our values – and that means ensuring business practices and regulations remain reflective of those values in the digital age.