
Data Clean Rooms: Balancing Innovation with Privacy Accountability

Jordan Abbott, Chief Privacy Officer

May 30, 2023


Acxiom has offered a version of what’s now referred to as a “data clean room” for more than 20 years. 

Since the late 1990s, we’ve been working with brands, especially in regulated industries like financial services, to essentially provide them with pseudonymized datasets in an environment where they can generate customer intelligence to fuel campaigns – all in compliance with federal laws including the FCRA.

So while there’s lots of excitement around data clean room technology, it’s also worth cutting through the hype to see what hasn’t changed: namely, the need to get the data right and to be a good steward of it. 

This data stewardship mindset is critical, even when (or perhaps especially when) new technologies and innovations appear to remove a lot of the burden around compliance and privacy accountability. 

Data clean rooms increase possibilities – and responsibilities

Fast forward a couple of decades from those early data-sharing days, and data clean rooms have taken things to a whole new level. Data partnerships now operate at far greater scale and support entirely new use cases, so in many ways clean rooms are a marketer’s dream.

But if data clean rooms give brands data-sharing capabilities “on steroids,” there’s a flip side to that coin: your privacy obligations haven’t gone away. In fact, in many ways they’ve expanded, too. When it comes to security and compliance, there is real complexity hidden in the apparent simplicity of all this intelligent data-sharing.

Data clean rooms offer advanced privacy controls

Data security and privacy go hand in hand, and here’s one analogy I’ve seen used to explain the relationship between the two. If security is the lock on your living room window, privacy is the blinds. In other words, you can have security without privacy (a locked window with the blinds open), but you can’t have privacy without security (the blinds aren’t much use if the window’s open).

Another shorthand that’s useful in the context of data clean rooms: if security is about access to data, then privacy is about use of data.

It’s important to state that data clean rooms do not exempt controllers from compliance with consumer and data protection laws like the GDPR and the CCPA, or the newer state-level laws in Virginia, Utah, and, most recently, Iowa. This is part of my warning to brands: don’t be misled into thinking clean rooms remove compliance complexity and privacy obligations from the picture.

But assuming the appropriate privacy notices are in place at the point of data collection, clean rooms should facilitate broader uses that benefit not only businesses but also people, who will ultimately enjoy better, more relevant experiences thanks to higher-quality, more controlled data use.

Greater controls give you greater accountability

Fans of the Marvel Universe will know Uncle Ben’s advice to Peter Parker: “With great power comes great responsibility.” The same rule applies to brands when they’re entrusted with people’s data. The good news is that while data clean rooms come with extensive privacy responsibilities, they also offer brands enhanced abilities to fulfill their obligations and, importantly, to demonstrate their accountability. 

A key consideration about data clean rooms is that you’re not physically sending data from one brand to another as controller or processor – or to Acxiom. You’re simply granting limited, highly pseudonymized access to certain tables of data, which come with their own data privacy frameworks and usually extra measures, like added “noise” (i.e., small amounts of false or randomized data) and encryption, to increase privacy protections. And more intelligent reporting on access and usage means brands have greater visibility and control when it comes to auditing and accountability.
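To make the pseudonymized-access idea concrete, here’s a minimal sketch of how two participants might match customers without ever exchanging raw identifiers. The keyed hash, the shared key, and the email addresses are all illustrative assumptions, not a description of how Acxiom or any particular clean room actually does it.

```python
import hmac
import hashlib

# Hypothetical shared secret agreed between clean-room participants;
# real deployments would use managed key exchange, not a hard-coded string.
SHARED_KEY = b"example-clean-room-key"

def pseudonymize(identifier: str) -> str:
    """Replace a raw identifier (e.g., an email address) with a keyed hash
    so records can be matched without exchanging the identifier itself."""
    normalized = identifier.strip().lower()
    return hmac.new(SHARED_KEY, normalized.encode(), hashlib.sha256).hexdigest()

# Each participant pseudonymizes its own customer list locally...
brand_a = {pseudonymize(e) for e in ["ann@example.com", "bob@example.com"]}
brand_b = {pseudonymize(e) for e in ["bob@example.com", "cara@example.com"]}

# ...and the clean room only ever works with the overlap of the hashed tokens.
overlap = brand_a & brand_b
print(f"matched records: {len(overlap)}")  # -> matched records: 1
```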

When it comes to adding noise to datasets, how much you inject will depend on the context. For example, use cases like advertising for brand awareness will tolerate significant added imprecision, whereas highly personalized financial service offers may demand the highest precision. 
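One common way to think about that dial is a noise mechanism whose scale parameter trades precision for privacy, along the lines of the Laplace noise used in differential privacy. The sketch below is purely illustrative; the scale values and the overlap count are made-up numbers, and a given clean room may use a different mechanism entirely.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_count(true_count: int, scale: float) -> float:
    """Report an aggregate count with added noise; a larger scale means
    stronger privacy protection but less precision."""
    return true_count + laplace_noise(scale)

true_overlap = 12_480  # hypothetical audience-overlap count

# Broad use cases (e.g., brand-awareness reach estimates) tolerate more noise...
print(round(noisy_count(true_overlap, scale=50.0)))

# ...while precision-sensitive use cases would use a much smaller scale.
print(round(noisy_count(true_overlap, scale=2.0)))
```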

And your data privacy obligations will also vary with the degree to which the data in question is identifiable. With data clean rooms, we’re largely talking about data that is highly pseudonymous, even to the point of potentially being non-personal. In a well-designed clean room, participants receive insights based on queries, as opposed to actually viewing another clean room participant’s data. As a result, a well-structured data clean room could lessen some of your and your partners’ compliance burdens. For example, a properly designed clean room can be structured so that, under the CCPA as amended by the CPRA, no sale or sharing occurs when a partner analyzes the data. Similarly, the partner should be able to avoid obligations that ordinarily attend the collection of personal information.
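Here’s a rough sketch of that query-based model: the clean room returns aggregate answers rather than rows, and suppresses groups too small to report safely. The threshold, field names, and segment labels are hypothetical, intended only to show the shape of the interaction.

```python
from collections import Counter

# Hypothetical suppression threshold: groups smaller than this are not reported,
# so small, potentially re-identifying segments never leave the clean room.
MIN_GROUP_SIZE = 50

def audience_insight(matched_rows, segment_field):
    """Return aggregate counts per segment for matched records only.
    Callers see summary insights, never the underlying rows."""
    counts = Counter(row[segment_field] for row in matched_rows)
    return {segment: n for segment, n in counts.items() if n >= MIN_GROUP_SIZE}

# Toy example: assume the clean room has already joined the two partners' data
# on pseudonymized keys (see the matching sketch above).
matched = (
    [{"segment": "auto_intenders"}] * 180
    + [{"segment": "new_movers"}] * 73
    + [{"segment": "rare_segment"}] * 4   # too small to report
)

print(audience_insight(matched, "segment"))
# -> {'auto_intenders': 180, 'new_movers': 73}
```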

A potential solution to privacy complexity

As with just about every innovation in the adtech world, there are hidden depths and complexities with data clean rooms that brands must be aware of. And these complexities are compounded by a fragmented privacy landscape across the United States. 

It’s yet another reason why I firmly believe that a national privacy law is necessary. It could help codify the processes and protections around data clean rooms and remove many of the difficulties brands face today when they have to deal with a patchwork of state laws. 

This, in turn, would empower brands to fully embrace that data stewardship mindset I mentioned earlier with clarity and confidence. Ultimately, it would lead to increased adoption of technologies like data clean rooms, which would be a plus for the brands using them, as well as the people who will benefit from greater privacy and security protections in the experiences they enjoy with brands.

Jordan Abbott

Chief Privacy Officer

Jordan Abbott is Chief Privacy Officer of Acxiom. He advises key stakeholders on legal, data governance, and compliance policy, and handles government relations, providing strategic insight on proposed legislation at the state and federal levels.
