People may consent to the use of their personal data, without being sufficiently aware or informed of the nature and extent of potential implications.

While most people value privacy in general, many readily consent to the use of their data so they can continue using certain digital platforms. For example, despite intense government scrutiny of TikTok and WeChat over security fears, both apps remain favoured by the majority of their users.

“When you go online, you basically want to get information. So, most users will click on a link because they want the information more than they are concerned about their personal privacy,” says UNSW Professor of Law, Leon Trakman.

But very often, people don’t really know exactly what they are agreeing to.

“Some of them do not care enough to read through convoluted privacy policies – and will justify this by saying it takes too long to do so, or they would not understand anyway. Others will respond that there is no information about them online. But in many cases, an astute hacker will find that information.

“The dilemma for internet regulators is to prevent data users from eroding the privacy of consumers, while recognising that data providers lose money when consumers decline to access websites informing them about the consequences of consenting to the use of their personal data,” Prof Trakman says. 

The importance of consent in data protection also cannot be overstated.

Consent is the basis for data subjects allowing a data controller to collect, process and use their personal data. But consent to the use of personal data has not only a legal, but also a cognitive dimension, Prof Trakman says.

In other words, data consumers may consent to the use of their personal data, without being cognitively aware and sufficiently informed of the nature and extent of that use by data providers. Data providers then exploit this cognitive asymmetry for their economic benefit, at the expense of consumers’ privacy.

What can be done to better protect people’s privacy?

Prof Trakman argues there is a need for greater legal consistency and harmonisation in the law governing consent to the use of personal data.

Cognitive deficiency arises for two reasons:

  • Data users often do not adequately alert or inform data consumers about the nature and extent of the prospective use of their personal data.
  • Data consumers often fail to take the time to understand the legal and practical consequences of consenting to that use.

“This lack of explanation given to data consumers, coupled with their limited understanding of the impact of the use of their data, is readily reflected in regulators seeking to safeguard the consent of minors,” Prof Trakman says.

Can government intervention help?

Most governments – certainly in the US and to some extent in Australia – tend to take a very cautious view when it comes to data protection.

“They don’t want to interfere with an industry that is doing extraordinarily well. And you can see today with COVID-19; the industries that have done best have been IT companies.

“The government doesn’t want to interrupt a source of revenue, or upset very powerful companies. And in effect, sometimes they can't,” Prof Trakman says.

“We’ve got traditional cases being brought against Facebook and Google for violating privacy rights and Google just ignores us. Facebook just ignores us because they have nothing here in Australia that can be used to pay fines or compensation to victims. Their headquarters are in the US and in Ireland.”

Much of these companies’ power stems from their access to a pool of data that they can collect and analyse to determine which advertisements or information you’ll be exposed to.

“There are all sorts of issues about information being sold for the purpose of spying. We also have information that elections are rigged based on the data that has been sold. Even large companies such as Facebook have had serious charges brought against them for providing personal data,” Prof Trakman says.

“They can transfer your data in bulk, just as Facebook did in providing personal data to Cambridge Analytica in a scandal linked to Trump’s election campaign in 2016. The information can have great value in assessing what strategies are most likely to win people over in marketing in general. Personal information relating to someone’s age, religion, sex, race, geography can be used for vote rigging, or in many cases, can be used against them.”

How far can regulations go to protect the consumer?

People need to know how their data will be used, Prof Trakman says.

“Most of us don't understand how complex the range of possibilities is once our data has been distributed, either with our consent or without it.

“And then you have people who are actively looking for information to blackmail you. Or they pretend to be you in accessing your bank account or submitting your paper as their school assignment. And they accomplish their goals well. So, your consent becomes complicated.

“If you rely on self-ordering – what we also call self-regulating – then the real regulator is the person that asked you to provide your data. That person then uses your data, or gives or sells it to someone else who will then use it,” he says.

“It’s not just about protecting the commercial interests of internet users. It is also about protecting the public from identity theft, fraudulent access to their bank accounts, and vilification. If mega-data corporations use personal data without consent, does their use of that data inflate profits? Does it aid predators who abuse that data?”     

Prof Trakman says people need to know what giving consent means in law, how their consent can be abused, and how to protect themselves from abuse.  

“The problem is that the internet is virtual and global. Personal data can be used anywhere, including in jurisdictions that do not require informed consent. Data collectors know these legal loopholes well, as likely do scammers and blackmailers who take advantage of them.” 

US regulators fined Facebook five billion dollars for providing personal data without consumer consent to Cambridge Analytica. But were the fines enough?

“Will they deter mass abuses of personal data? Will they lead to regulatory guidance globally to avoid future replication of such abuses?”

So far, the European Union (EU) has taken comprehensive steps towards regulating access to personal data.

“Singapore has done so to a certain degree, but it has stalled on regulating very large IT companies. In Australia, we’ve relied on the existing regulatory framework of consent law. And then there are countries that have done really little, such as China.”

Prof Trakman says Chinese law makes little direct reference to consent to the use of personal data, and consent is conceived differently there.

“One of the reasons is that the protection of personal data is secondary to the protection of networks, systems and platforms in China. However, China’s Cybersecurity Law does require both the notification and consent of data users whose personal information is collected by providers of network products and services. Still, consumers should be more than incidental beneficiaries of regulated network systems,” he says.

So what solutions do we have?

The main issue is the absence of a unified and global enactment of the law of consent.

Existing privacy and consent policies are long and often convoluted. A partial solution is to provide ratings systems or images that capture the different options in a way consumers can easily read and understand.

“The concept of consent to the use of data should be more consistently conceived and applied by countries like Australia, China and Singapore, and not only by regional associations such as the EU and APEC.”

Prof Trakman says the central response to such divergence across countries and regions is to redress the inability of data consumers to understand the use of the personal data to which they are consenting.

“We have provisions in the Organisation for Economic Co-operation and Development (OECD), but they are dated. Then we have instruments that could work as a unifying force, to which the vast majority of countries are signatories; but first, we have to get countries to agree.”

The reality is that companies sometimes either don't want to, or don't have the capacity to, administer such changes. As a result, people receive mixed messages about what companies are willing to do.

Professor Trakman proposes the following to raise awareness of digital consent and the need for data protection law:

  1. Build a regulatory framework based on past experiences, along with illustrations of how this can be achieved. The idea is to have the regulatory framework adopted and utilised, while recognising that different countries will do so differently.
  2. Observe individual national frameworks for efficiencies. In other words, have a common framework to identify factors that align with the uniform framework, including reviewing areas where it could be extended to better accommodate difficulties such as those surrounding traditional notions of consent and privacy.
  3. Develop a model law which provides for variations, such as opt-outs, to allow countries to differ in how they apply it. This addresses one of the risks in identifying the nature and extent of consent. The EU is embarking on this pathway.
  4. Implement a risk management approach to address and manage the risks, and to identify who will benefit from this agreement and whether those gains are legitimate.

Professor Trakman’s paper on “Digital consent and data protection law – Europe and Asia-Pacific experience” is available here.