The Goldilocks Problem: AdTech and other complex processing
It’s been less than a year since our inboxes were flooded with “Don’t leave us!” emails as the GDPR came into effect. This week the French data protection regulator (the “CNIL”) used its increased powers under the GDPR to issue the first significant financial penalty – fining Google €50 million for lack of transparency and valid consent to serve personalised ads to users.
In this blog post, we reflect on the broader implications for AdTech and similar data-heavy businesses. How can you describe complex processing activities without overwhelming the individual with information? When you give individuals choices, is there a risk of providing too much choice? Like Goldilocks, how can you ensure your customers get just the right amount of information and control?
Background rumblings on AdTech
Scrutiny of the advertising technology or “AdTech” industry has been growing among European data protection regulators and privacy campaigners for some time.
In June 2018, the CJEU decided that where personal data is gathered on a Facebook fan page to serve personalised ads (among other things), both Facebook and the owner of the page are joint controllers. In November came Privacy International’s submissions against a number of AdTech companies, which it considers are exploiting individuals’ personal data without appropriate permissions. Then, just a few days later, the CNIL published its findings in relation to Vectaury, a French AdTech company which relied on invalid consent to create profiles about individuals and participate in bid auctions for targeted advertising.
Online behavioural advertising sits at the intersection between data protection and eCommerce regulation. Much of the tension between data protection regulators and the AdTech industry centres on the industry’s perceived lack of transparency and the lack of control individuals have over their personal data in the digital advertising ecosystem. This is due in part to the complexity of this ecosystem, in which the process of getting an advert from advertiser to target audience can involve a number of parties and the aggregation of personal data across multiple sources (often in ways an individual may not expect).
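To see why this ecosystem is so opaque, it helps to look at the data that actually changes hands. The sketch below is a hypothetical, heavily simplified TypeScript model of the kind of bid request a publisher’s ad slot can broadcast into a real-time auction. The field names are our own illustration, loosely inspired by OpenRTB-style requests rather than any particular exchange’s schema:

```typescript
// Hypothetical, simplified shape of the data an ad slot can emit into a
// real-time bidding auction. Field names are illustrative only; real
// requests vary by exchange and carry many more fields.
interface BidRequest {
  id: string;                           // auction ID, generated per impression
  device: {
    ifa?: string;                       // resettable advertising identifier
    geo?: { lat: number; lon: number }; // device geolocation, if available
    ua: string;                         // user agent string
  };
  user?: {
    id?: string;                        // exchange-specific user ID, often synced across parties
    segments?: string[];                // interest segments aggregated from multiple sources
  };
  site: { domain: string; page: string };
}

// Each intermediary (exchange, DSP, data broker) that receives this request
// sees the identifiers above and can enrich its own profile of the user,
// which is why individuals rarely know who ends up holding their data.
const example: BidRequest = {
  id: "auction-123",
  device: {
    ifa: "38400000-8cf0-11bd-b23e-10b96e40000d",
    geo: { lat: 48.8566, lon: 2.3522 },
    ua: "Mozilla/5.0 (Linux; Android 9)",
  },
  user: { id: "exchange-user-456", segments: ["travel", "luxury-goods"] },
  site: { domain: "example-news.fr", page: "/articles/gdpr" },
};
```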
What about Google?
Some of these issues came to a head in the CNIL’s investigation into Google, prompted by complaints made by privacy campaigning groups None of Your Business and La Quadrature du Net, who claimed that Google did not have a valid legal basis for processing personal information to send targeted advertising.
The CNIL’s investigation looked at the customer journey of creating a Google account on a new Android smartphone, and identified two key breaches of the GDPR:
1. Provision of information to individuals
The CNIL decided that the information Google provides to customers to explain how their data is being used is not easily accessible or understandable. While Google provides a great deal of information to customers about how it uses their data (such as the processing purposes, the storage periods and the categories of data used for ad personalisation), that information is spread across numerous policies and notices, with different buttons and links that the user must click on. In some cases, a customer would have to take five or six steps to get to that information. Moreover, the information in those policies and notices was sometimes vague and imprecise – for example, not providing concrete retention periods.
2. Legal basis for processing
Consent is not the only lawful basis for processing, and in many cases is a last resort, given the strict requirements placed on it. In this case, Google tried to rely on consent, but the CNIL decided the consent was invalid. Firstly, customers had not been properly told what they were consenting to. Details of how a customer’s data would be used for ad personalisation were spread over several policies and notices. Secondly, customers did not actively agree to that use. The default was to use a customer’s data for targeted advertising, and to change that position the customer had to click on a button and untick various pre-ticked boxes. This is a breach of the GDPR, which excludes the use of pre-ticked boxes. Finally, the consent covered multiple different processing operations carried out by Google (ad personalisation, speech recognition, etc.) rather than being “specific” for each purpose.
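For contrast, a consent mechanism designed around the CNIL’s objections would collect a separate, affirmative opt-in per purpose, with nothing pre-ticked. The following is a minimal, hypothetical TypeScript sketch of that idea; the purpose names and structure are our own illustration, not Google’s actual implementation:

```typescript
// Hypothetical processing purposes, for illustration only.
type Purpose = "adPersonalisation" | "speechRecognition" | "locationHistory";

interface ConsentRecord {
  purpose: Purpose;
  granted: boolean;   // must default to false: no pre-ticked boxes
  timestamp?: string; // when the user took the affirmative action
}

// Defaults: every purpose starts un-consented.
const consents = new Map<Purpose, ConsentRecord>([
  ["adPersonalisation", { purpose: "adPersonalisation", granted: false }],
  ["speechRecognition", { purpose: "speechRecognition", granted: false }],
  ["locationHistory", { purpose: "locationHistory", granted: false }],
]);

// Consent is recorded only on an explicit user action, one purpose at a
// time, rather than bundling several operations behind a single "I agree".
function grantConsent(purpose: Purpose): void {
  consents.set(purpose, {
    purpose,
    granted: true,
    timestamp: new Date().toISOString(),
  });
}

// Processing should check the specific purpose before going ahead.
function mayProcess(purpose: Purpose): boolean {
  return consents.get(purpose)?.granted === true;
}
```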
The CNIL justified its decision to fine Google €50 million – and to publish details of the fine – on the grounds of the severity of the breach, the infringement of the core concepts of transparency and consent, the level of intrusion, the fact that the breaches were continuous rather than a one-off, the prevalence of Android phones in the French market and the centrality of targeted advertising to Google’s business model.
Google has said it will appeal this decision.
Emerging patterns on transparency and the use of consent
Across Europe there have been similar rumblings regarding transparency and consent in the AdTech ecosystem. One example is the CNIL’s action against Vectaury.
Vectaury is a French AdTech company that uses geolocation data from smartphones to profile individuals and offer them targeted advertising. As with Google, Vectaury failed to demonstrate it had an appropriate legal basis to process personal data, in part because the consent it relied upon was not informed, clear or affirmative. In the version of the Vectaury consent management platform considered by the CNIL, individuals had to trawl through pages of privacy information to disable the default authorisation for the use of their data for targeted advertising. They were only told that their geolocation data would be used after installing the Vectaury partner’s app, by which point that data had already been transferred to Vectaury; the purposes of use were drafted in unclear terms; and key information, such as the identity of the controllers in the AdTech ecosystem who would be using their data, was not provided.
Notably, Vectaury relied on consent obtained by third parties. Personal data from real-time bids on Vectaury partners’ applications had passed through a number of intermediaries before arriving with Vectaury (a process over which, it appears, Vectaury had no oversight). The CNIL decided that smartphone users were not sufficiently informed about this process, and their consent was therefore invalid.
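The structural problem is that each intermediary inherits a consent signal it did not collect and often cannot verify. Below is a hedged sketch of what a downstream recipient would need to check before processing, assuming a hypothetical consent signal passed along with the bid data. None of these fields come from the CNIL’s decision; they simply illustrate the kind of provenance information that was missing:

```typescript
// Hypothetical consent signal attached to data as it moves down the chain.
interface ConsentSignal {
  collectedBy: string;            // the app or site that actually asked the user
  informedOfGeolocation: boolean; // was geolocation use disclosed before collection?
  namedControllers: string[];     // controllers disclosed to the user
  purposes: string[];             // purposes disclosed, in clear terms
}

// A downstream AdTech recipient can only rely on upstream consent if it can
// demonstrate that the consent actually covers its own processing.
function canRelyOnConsent(
  signal: ConsentSignal,
  recipient: string,
  purpose: string
): boolean {
  return (
    signal.informedOfGeolocation &&
    signal.namedControllers.includes(recipient) && // user told who would process
    signal.purposes.includes(purpose)              // and for what purpose
  );
}
```

On the CNIL’s findings, each of these checks would have failed in Vectaury’s case: users were informed too late, the purposes were unclear, and the downstream controllers were never named.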
Privacy International’s complaint flags many similar issues, focussing on the “interminable data sharing” between different players in the AdTech market to create individual profiles, which results in unclear chains of consent and inadequate fair processing notices. Privacy International raised additional concerns about the creation by these companies of detailed and comprehensive profiles of individuals, from which it may be possible to infer highly sensitive information. They argue that, as many of these companies are non-consumer facing, individuals have no way of knowing how their data is being used and, as such, cannot provide valid consent to that processing.
Who is the controller?
Another problem is that the AdTech ecosystem does not map neatly onto the controller/processor framework set out under the GDPR. The industry is making concerted efforts towards GDPR compliance (for example, with the launch of the IAB’s Transparency and Consent Framework), but regulators clearly feel there is more work to do. One piece of the GDPR compliance puzzle may be hiding in plain sight. European courts have considered the notion of joint controllership three times over the last year (twice in the context of AdTech), shedding some light on the role of the controller in this sphere.
In Schleswig-Holstein, the CJEU found that a German education company that set up a Facebook fan page was a joint controller of the personal data processed by Facebook’s cookies on that page, even though it did not have access to that data. This was because it provided Facebook with the opportunity to process personal data and had influence over how such data was processed. The court came to much the same conclusion in Tietosuojavaltuutettu v Jehovan todistajat, deciding that the Jehovah’s Witness religious community and its members who engaged in door-to-door preaching were joint controllers of notes resulting from that activity – further clarifying that “joint responsibility for several actors involved in the same processing activity does not require each actor to have access to the personal data”. These cases may reflect the lack of any concept of secondary liability in EU law. Faced with a regulatory lacuna for those assisting or encouraging processing by others, the Court of Justice has simply stretched the scope of primary liability.
A more recent case, Fashion ID v Verbraucherzentrale, looks at the scope of responsibility and liability as between joint controllers. It relates to an online fashion retailer that embedded a Facebook ‘Like’ button on its website. In his opinion, Advocate General Bobek suggested that a joint controller should only be held liable for the phase of data processing under its control, rather than the preceding or subsequent stages of the overall chain of processing for which it was not in a position to determine either the purposes or the means.
These cases have significant implications for actors in the AdTech space, particularly publishers who have historically borne little responsibility or liability for the personal data collected and used by AdTech companies placing advertisements on their websites. The AG’s opinion in Fashion ID v Verbraucherzentrale suggests a more pragmatic approach that reflects the multi-stage journey of personal data through the AdTech space and may have a positive outcome by encouraging industry players to better define their roles and responsibilities in relation to the data they process. Equally, the much-awaited EU ePrivacy Regulation is likely to regulate this industry more clearly and to allocate the responsibilities of publishers and AdTech providers.
Providing the Goldilocks solution
Underpinning all of this is the question of how to provide users with meaningful information and control, especially where the underlying processing operations are complex and involve multiple actors.
The solution must start with privacy notices. They need to be simple and concise, particularly when describing more complex or intrusive processing. While best practice is to take a layered approach, this does not mean a warren of interlocked terms and conditions.
Basing processing on consent should be a last resort in many situations. Where consent is used, it must meet the strict requirements of the GDPR.
This means it must be:
- supported by details about what the individual is consenting to – those details being concise but also sufficient to describe the processing properly. This is a challenge in an industry where data is shared on such a large scale, but one which should be tackled by providing a clear and comprehensive notice. When relying on consent: if you can’t explain it, you can’t do it;
- evidenced by some form of affirmative action. Pre-ticked boxes and default acceptance settings aren’t going to be sufficient;
- as easy to withdraw as it is to give. This requires efficient processes for consent chains (see the sketch after this list); and
- not overly complex and burdensome. Individuals need to be given some choice but not too much choice. Overly complex mechanisms may not provide valid consent.
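To make the last two points concrete, the sketch below (in the same hypothetical TypeScript vocabulary as the earlier examples) treats withdrawal as a first-class operation that is no harder to invoke than the grant, and that notifies downstream recipients in the consent chain. The notification mechanism is an assumption on our part; the GDPR mandates the outcome, not any particular implementation:

```typescript
interface PurposeConsent {
  purpose: string;
  granted: boolean;
  updatedAt: string;
}

// Downstream parties that received data on the basis of this consent must
// be told when it changes. How they are notified is illustrative only.
type DownstreamNotifier = (purpose: string, granted: boolean) => void;

class ConsentManager {
  private consents = new Map<string, PurposeConsent>();

  constructor(private notifyDownstream: DownstreamNotifier) {}

  // Granting and withdrawing are the same one-step operation: if giving
  // consent takes one tap, withdrawing it must not take six.
  grant(purpose: string): void {
    this.set(purpose, true);
  }

  withdraw(purpose: string): void {
    this.set(purpose, false);
  }

  private set(purpose: string, granted: boolean): void {
    this.consents.set(purpose, {
      purpose,
      granted,
      updatedAt: new Date().toISOString(),
    });
    // Propagate the change along the consent chain.
    this.notifyDownstream(purpose, granted);
  }
}
```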
So – not too much information and not too little. What does “just right” look like? The acid test may be to put yourself in the shoes of the individual. Ask yourself: can you really understand where your information is going and why?
By Richard Cumbley, Olivia Grimshaw and Dennis Holmes