Episode 42

A Balanced View of Data Privacy

Created on July 6, 2023


Sachiko Scheuing, Acxiom European privacy officer and four-term chairwoman of the board of FEDMA, joins the Real Talk about Real Identity podcast to consider a balanced view of consumer privacy that works for people and businesses, and to discuss the more flexible approach to GDPR being proposed in the UK's new DPDI Bill. But that's not all! Federated IDs, AI, virtual reality, risks in a changing IT ecosystem and the "golden rule of data privacy" are all part of this don't-miss conversation.

Transcript

Kyle Hollaway:

Hello, and welcome to Real Talk about Real Identity from Acxiom. This podcast is devoted to important identity trends and the convergence of AdTech and MarTech. I'm Kyle Hollaway, your podcast host, and I'm joined by our co-host, Dustin Raney.

Dustin, I was reading a post recently by Barry Scannell, a leading AI law expert, concerning the EU Parliament's AI Act. The Act seeks to regulate high-risk AI systems and foundation models like those on which ChatGPT is based. While we aren't going to be discussing AI directly on today's show, it certainly reminds me of how the EU Parliament is once again at the forefront of major consumer protection-related legislation. Similar to how the AI Act will have a material impact on the marketing and advertising industry, the General Data Protection Regulation, or GDPR, has had significant impacts over the past five years, which leads me to the introduction of today's guest.

I am super excited to introduce Sachiko Scheuing, European privacy officer for Acxiom and four-term chairwoman of the board of FEDMA, the Federation of European Data and Marketing, a Brussels-based trade association. She’s also one of the founders of Women LEAD, an Acxiom business resource group focused on gender equity in business. Sachiko is a dynamic speaker, passionate advocate for and expert leader in privacy, and we are thrilled to have her on the show today.

Sachiko, welcome to Real Talk.

Sachiko Scheuing:

Hi, Kyle. Thank you very much for the invitation. I do need to correct one thing, though: I am not one of the founders of Acxiom Women LEAD; I'm just one of the coaches. I joined a little bit later, but that doesn't mean I'm less interested in the topic. If anything, I'm even more excited about it. But thank you for your invitation.

Dustin Raney:

Sachiko, so excited to finally get you on our podcast. We’ve, in the past couple of years, had a chance to really get to know each other while at I-COM in Spain. Before we talk about some of that, would you mind giving our listeners some insight into your background and how you got to where you are today?

Sachiko Scheuing:

Sure. Well, a long time ago, in my previous life at Acxiom, if you will, I was actually the chief analyst of Acxiom's Dutch office. I was doing that for a number of years, and my privacy boss came over to the Netherlands to say how excited she was about this privacy thing, how it is really at the core of Acxiom's business, and blah, blah, blah, and that really made me think, "Hey, this new thing sounds so interesting. I'm going to do that." That's exactly what I did. I went to the CEO and said, "Hey, that chief privacy officer came. Can you please get in touch with her and tell her that I want to do privacy here in Europe?" That was 2005, and I've been doing what I'm doing today ever since.

Dustin Raney:

We’re so glad that you did. One of the things that I’ve always admired about your positioning on privacy is that you’ve always held I feel like a really good balance, because it really is about a value exchange. If you go too heavy on regulation, for instance, on representing one side or another, whether it’s an advertiser, the publisher, the vendors, the consumer, then that imbalance could potentially be catastrophic to the industry and across all entities. Always loved your feedback and perspective on consumer privacy in general.

Sachiko Scheuing:

Yeah, actually, you raised a really, really good point because, interestingly, you said it can get so heavy on privacy and so on, but the GDPR, a law we talk about a lot all over the world, not only in Europe, actually says, "The right to data privacy is not an absolute right." You always need to balance it against other rights and freedoms, like the right to conduct business and so on. The way you phrased it is absolutely spot on. That, in my view, is how we need to look at the law.

Kyle Hollaway:

Speaking of GDPR, it's coming up on its five-year anniversary, and your interest in privacy preceded that. Obviously, you got involved with privacy well before that time, you've seen the legislation come through, and now it's starting to mature to a degree. Where do you see things heading with the regulation? Now that there have been some rulings against some of the major platforms, and other developments, do you see any pulling back or refinement of it? Do you see it actually extending? What are the prevailing headwinds right now?

Sachiko Scheuing:

Having lived with the GDPR for five years… and, by the way, my first involvement with the GDPR, I would say, was in 2008, when the European Commission said, "Hey, we want to know if this law is still up-to-date enough to actually regulate all the digital evolution that is taking place." Well, we have certainly come a long way from that. I think what we can expect right now is that, in reviewing the GDPR, there are a couple of things we need to take a look at: what went well, but also what did not go particularly well.

I think one of the things we really need to look at is whether the GDPR is… I mean, going back to that previous conversation, five years of GDPR really makes us think about whether the law is working the way policymakers intended. For one thing, I did say that GDPR, or the right to data protection, is not an absolute right. There's this thing called the recital, which is like a footnote to the law. Recital 4 of the GDPR clearly states that the right to the protection of personal data is not an absolute right, so I'm not making this up or something. It actually says it there. Maybe it is being understood in a way that is too rigid.

In any case, this was not only my understanding; it seems this was also the British government's understanding because, due to Brexit, they had a chance a little bit earlier than the European countries to reevaluate the effects of the GDPR. As a result, they are now working on a new law called the DPDI Bill. The improvement I see, at least in the draft today, is that they have started to understand that GDPR may be placing a disproportionate burden on SMEs.

Of course, we all know SMEs are really the backbone of our economy, so if we dampen them down, we can forget about our economy blooming. I think that's a really interesting thing. In particular, for the marketing sector, what I feel is really important is this concept of legitimate interest, which means you are the one safeguarding the data, putting all sorts of mechanisms in place to make sure the data can be used without being outweighed by the other interests the consumers have, and so on.

In the current GDPR in Europe, legitimate interest as a legal ground for marketing sits, once again, in this footnote, in the recital section, but the UK DPDI Bill is now bringing it into the main law, saying, "Hey, you can really use this data to do marketing and so on." I think it is really interesting to see how the UK government reacted to the learnings from the GDPR.

The other thing is, I just want to talk a little bit more about this rigidness of interpreting the GDPR, because we all think, "GDPR, oh, it's so strict. We can't do anything," and that is so wrong. GDPR is actually based on the so-called risk-based approach. Say you want to use some data for a certain campaign, or you want to do a certain analysis, but you think maybe this is in a gray area. Well, GDPR doesn't say, "No. No. Gray. It's not white. Stop it." Rather, GDPR says, "Are there ways you can think of to reduce the risk so that it's no longer gray, so the risk becomes more acceptable?" If you manage to do so, by means of putting extra security in place or pseudonymizing the data and so on, then it's okay to actually use the data, so it's so much more flexible. I hope that this learning will be reflected in the European laws as well. We are seeing so many laws emerging in the area of data use and digital data use.
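To make the pseudonymization idea concrete for readers, here is a minimal sketch of one common risk-reduction technique: replacing a direct identifier with a keyed hash before a dataset is used for analysis. The field names and the keyed-hash design are illustrative assumptions on our part, not a compliance recipe from the conversation.

    # Minimal illustrative sketch (not from the podcast): pseudonymizing a
    # direct identifier with a keyed hash so the campaign dataset no longer
    # carries the raw ID. Field names and key handling are assumptions.
    import hashlib
    import hmac

    # In practice the key would live in a separate, access-controlled store.
    SECRET_KEY = b"hypothetical-key-held-apart-from-the-data"

    def pseudonymize(identifier: str) -> str:
        """Return a stable pseudonym for an identifier using HMAC-SHA256.

        Only someone holding SECRET_KEY can recompute the mapping, which
        lowers the re-identification risk of the shared dataset without
        destroying its analytical usefulness.
        """
        return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                        hashlib.sha256).hexdigest()

    record = {"customer_id": "C-10492", "interest": "fountain pens"}
    safe_record = {**record, "customer_id": pseudonymize(record["customer_id"])}
    print(safe_record)  # raw ID replaced by a 64-hex-character pseudonym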

Kyle, at the very beginning, you mentioned the EU AI Act, which is really timely because, yesterday, the European Parliament adopted it, but you also have the DMA and the DSA, the Digital Markets Act and the Digital Services Act, the Data Act and all sorts of things. It seems like this is going to go on for another couple of years, but what does that mean? It means the goalposts are going to move very frequently in Europe.

By the way, what happens in Europe usually spills over to other countries, to the United States and Canada, to LatAm, to wherever, so my golden tip to everybody, and maybe I'm just doing myself a mega favor: work closely with your privacy and legal people. They will be able to navigate you through the labyrinth of law, this exponential legal growth, I would say.

Kyle Hollaway:

Yeah. No. I think that's a great call-out. I know that's been a continued focus within our work here: ensuring that we're doing those PIAs, the privacy impact assessments, to really vet any business process and ensure it aligns with those privacy components, but also having that dialogue internally, not to overreact or assume something can't be done, but rather to say, this is what we would like to do.

Sachiko Scheuing:

Yeah. Exactly. It’s a dialogue.

Kyle Hollaway:

How do we fit that? Right. How do we fit that within the constraints of the law?

Sachiko Scheuing:

That’s why I love working with you guys. That’s why I love working with the engineers, because you come guys always come with challenges. It’s not like I’m going to say no or my team is going to say no, but we think, “Hey, what can we do with each other? You have this technical knowledge. Can we solve privacy problems with our technical knowledge?” That’s what I’m actually talking about when I talk about risk-based approach. If it seems very difficult, then let’s see what are the steps that we can take to make it okay again.

Sorry I interrupted you, Kyle.

Kyle Hollaway:

No. That was great. I did have one question because, obviously, most of our listeners are likely in the US and we’re not directly under GDPR, but speak a little bit to the extraterritorial aspect of GDPR and how it could impact US-based businesses.

Sachiko Scheuing:

Yeah. I think one of the big things about not only the GDPR, but other laws as well, is this aspect of cross-border data transfer. As you know, there's a lot of movement in that area. Think about Schrems I and Schrems II, but also about the UK-US data bridge, which is being built or is already built, I don't know. I think one does need to have a strategy for dealing with this challenge.

I would really like to see a world that allows cross-border data transfer between different countries, provided that the same level of protection is actually given in the data-receiving country, because to say, "Hey, cross-border data transfer is illegal," is not taking reality into consideration. I mean, there is no national border on the World Wide Web, is there? We need to come up with a framework, and also precautions, to make sure these risks are dampened down as much as possible, ideally eliminated, so that we can also support global economic growth and innovation.

Dustin Raney:

It seems that I’m just hearing you talk about how in some ways maybe the markets overreacts to legislation because they might not completely understand it. They don’t know about these articles that you’re talking about that actually have a little bit more balance than they might think, but regardless of that, the market reacts, and people are tying this whole cookie-list thing directly to GDPR and CCPA, for instance, cookies not transparently tracking you across the system, browsers start deprecating. How did the market react? We start creating federated IDs. In the US, that looked like more based on a hashed email, so like, for instance, UID 2.0. UID 1.0 was built on a cookie. UID 2.0 was built on a hashed email.

In Europe, it seems the telcos are stepping in and building a federated ID based more on the phone number, because telcos have phone numbers. We heard about one of the companies coming out, called Utiq, which is basically a co-op of the major telcos in Europe trying to build a federated ID off that phone-number spine for addressable media and advertising. On the other side, we heard people trying to mimic Google and build anonymized browser-based IDs. We saw the two camps starting to rise, and honestly we're seeing that globally: some saying, hey, build a federated, deterministic ID off of your PII; others saying, use a cohort-based, browser-based ID.
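As an aside for readers, here is a minimal sketch of what a deterministic, hashed-email ID of the kind mentioned above might look like mechanically. The normalization rules and the salt are illustrative assumptions, not the actual UID 2.0 specification.

    # Illustrative sketch only: a deterministic ID derived from a hashed,
    # normalized email address. Not the real UID 2.0 algorithm.
    import hashlib

    def email_to_id(email: str, salt: str = "hypothetical-salt") -> str:
        """Normalize an email and hash it into a stable, shareable ID.

        The same address always produces the same ID (deterministic), so it
        can act as a match key across publishers, while the raw email is not
        exposed downstream.
        """
        normalized = email.strip().lower()
        return hashlib.sha256((salt + normalized).encode("utf-8")).hexdigest()

    print(email_to_id("Jane.Doe@Example.com"))
    print(email_to_id(" jane.doe@example.com "))  # same ID after normalization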

What are your thoughts on what’s happening there? Do you feel like either solution is really a viable solution?

Sachiko Scheuing:

Well, in principle, I think the more options there are, the more helpful it is for the marketeers, because there are different ways to solve one problem and you're not actually stuck. Let me first of all start by commenting on this telecom initiative. I think this is going to be really, really interesting because the telecommunications sector, regardless of which region of the world you go to, is usually really, really tightly controlled. I think they are indeed doing this: I think they're going to deploy a lot of pseudonymization, and perhaps even anonymization, processes to make sure that it's super compliant and also acceptable under the telecommunications regulations and so on.

Then I think about other solutions. Well, if I were a marketeer sitting down and thinking, "How am I going to build up my first-party identity?" or maybe even, "How am I going to make sure my campaign measurement works seamlessly?", one of the things marketeers are starting to ask, because their compliance departments are coming to them and telling them to take this into consideration, is: which channel, which solution, will have the highest level of trust and will protect our brand reputation the most?

I think those are the questions that will be used to determine which solutions they use. Maybe they will have a portfolio of different solutions. I mean, we will see, but one thing I can say: I really like this evolution, that many clever people are coming up with really interesting solutions, and I think it is really a hot area to keep your eyes on, particularly for privacy people.

Dustin Raney:

Right. I totally agree. I really want to get your thoughts, Sachiko. We're talking about Google. It was FLoC and then it went to Topics, but basically it's storing an ID on your device, likely without you really knowing what's happening. It's contextual, but it's anonymous. Regardless, it's an ID that's keeping up with you, basically recording what contextual information you've been interested in as you browse the internet. That's essentially what it's doing, I think.

Do you believe, if I'm a consumer, that it's meeting my expectations as far as transparency, and as far as compliance, even from a regulator's perspective? Do you think explicit consent should be required to put any ID on a device?

Sachiko Scheuing:

Well, let me start by talking about the European market. There is this thing called the ePrivacy Directive. It is not really GDPR, but this ePrivacy Directive says, "You need consent." If the law says you need consent, then you need consent. You don't have any other choice. The thing is, though, exactly what you say is being debated a lot in Europe at the moment, so let me first talk about the idea of why anybody would want to put control of consent, control of the data, in the hands of consumers.

Maybe for the US’ point of view, this is a little bit strange because, if a company goes out and collects data, I don’t know, I’m just guessing, correct me if I’m wrong, perhaps the data that is collected would belong to that US company that actually put in the effort, invested in people to collect that data. The thinking behind data protection, particularly the modern or the current data protection, actually goes back to this thinking called informational self-determination. It sounds very, very hardened, and that’s because it comes from Germany though, to actually be quite honest, it actually comes from a Greek-born professor, Professor Spiros Simitis, who’s actually referred to as the international father of data protection.

Anyway, he said everybody should be able to control his or her own data. You have to remember, this was in the '70s. We didn't have the internet. We didn't have apps. We didn't have all these things that are normal to us today. The question is: is this concept still meaningful to us today? Some of the things derived from his thinking are still in place today and being adopted in the United States, too, like the reporting you see under the CCPA. In Europe, we use the term data subject rights: the right to know what information is being collected about you, the right to correct the information if it is wrong, and so on.

It is still there, but the reason I ask whether that is still meaningful today is: how can a nontechnical, nonlegal person understand exactly what is being done with the data, even after reading the privacy policy? You say things like, "We use your data to generate inferences to create a profile about a consumer reflecting the consumer's preferences or characteristics." It's like, "What?" For most of us, I think one of the questions we need to raise is: is it fair? Also, privacy policies are not particularly concise. They're usually in legalese and really, really, really long. The question we need to ask is whether it is fair to expect that a consumer is able to digest that information and make a decision about it.

My thought about it is that we really need to empower and educate ourselves. We need to educate our children, but our adults, too. We need to make it easy for everyday people, who are not lawyers and not technical people, to understand what these sentences actually mean. That is my thinking, but my thinking is not that important. More important is what, for instance, the European Commission is thinking.

I think this is going to be really, really interesting because DG Justice, the consumer protection wing of the Commission, is now coming out to say, "Hey, look, I know that we need to do everything on the basis of consent, but then, frankly speaking, there's this consent fatigue." It's because it's not centralized: you need to consent on any and every website you visit or app you download or whatever. So they have started this initiative called the cookie pledge, and what it is, is a voluntary framework. It is not going to be law, "You have to do it"; it's that you, as a company, can try to make it easier for consumers.

One of the ideas there, to save consumers from clicking so many okay buttons, is to put it in a browser setting and have the browser administer the consent. Well, of course, that's a good idea, but people also debate, hey, wouldn't that give a disproportionate amount of power to those browsers that are already so powerful? Any and every idea that will improve the situation is welcome. A good idea would be to have the publishers unite and offer a central opt-in mechanism, even something that says, hey, you know what, I actually want to see more advertising about fountain pens. I like fountain pens. I don't collect them because they're so expensive, so I don't have a big collection, but I like fountain pens, and then they know that this is what I'm interested in, instead of me spending hours and hours looking across different websites. If a solution through a network of publishers, for instance, is able to provide me that, I think I'll be really, really grateful.

Having said all that, yes, indeed, that is something we are also discussing in FEDMA, which you mentioned, Kyle, in the introduction. Next week, we are going to discuss five years of GDPR. We have indeed invited the legal policy person from DG Justice and Consumers of the EU Commission, so, yeah, watch this space.

Kyle Hollaway:

I’ve got a question, and I don’t know if it’s really a question or just a thought and get your reaction to it, but I think one thing that I’ve continued to wrestle with in this whole category, and you were talking about giving the consumer the right, the control. Part of the question is that’s kind of predicated on the assumption that the consumer actually knows what’s best for themselves, but consumer knowledge is very limited in many ways. You don’t necessarily know, like you said, it’s a very complex ecosystem, what brands are doing. It’s hard to understand really how it all functions, so, A, does the consumer really have the capability to really make an informed decision or are we just putting it in the hands of somebody and just going like, hey, make this important decision? You probably don’t know. At some point, like with my car, I eventually defer to the mechanic because I don’t really know how the car works, and so I’m like, hey, if you think that’s best, I have to make an assumption there, so there’s some implied trust.

Do you think we can get to a point where there's a balance between trusting the experts to do things, or do we always need to push towards the lowest common denominator, which is the individual? It probably is a balance between individual rights and what's best for the population as a whole, because that's more efficient. Like you said, it's about enabling brands, especially those small to medium businesses, not to face an onerous outcome or be precluded from something just because somebody made an uninformed decision. I don't know if I'm making sense here. What is your view on that? Is there a point where we can overcorrect toward individualism and individual rights, and where we need to find some balance?

Sachiko Scheuing:

Well, Kyle, you are spot on. You are really spot on. That was one of the issues GDPR wanted to address, and it has done so by basing itself on the accountability principle. Basically, it is indeed so. Consent, I'm not really sure if consent is fair because, like you say, what you're doing is shoving the responsibility onto the consumer, "Make that important decision," and then, if later on the consumer complains, you'll be like, "Well, you clicked on okay." In a way, it's really, really unfair, so that's why.

Back before GDPR came into effect, I was at one of those seminars by the European Data Protection Supervisor, who at that time was Mr. Giovanni Buttarelli, from Italy. We call them legal grounds in Europe and in many other countries that adopted GDPR-like laws: you actually have five or six reasons why you can use personal data. To carry out a contract, because it is in the interest of national security or the public interest, and so on; then you have legitimate interest, and then you have consent. And he said: take a look at all the legal bases upon which you can base your use of data and, if you can't find anything, then use consent, not the other way around.

That approach also makes sense, because it is the company or the organization that is actually using the data. They are responsible for protecting the data in such a way that the right to data protection of consumers, of everybody, is protected, both in the environment and in the way the data are used. I really do want to see this accountability principle taking stronger root in the GDPR, but also in other countries. In Singapore, I think they have struck a great balance on how you can use techniques such as pseudonymization and anonymization. Well, anonymization is, of course, anonymization. Let's say pseudonymization, strong pseudonymization, to decrease the level of risk in using the data.

That’s one side. My strong feeling is I really do want companies and organizations to be more accountable as the GDPR says. The other side is, however, hey, those who are interested and who really want to exercise their right, they should be encouraged and empowered to do so. That’s why I think it is not just, “Do the accountability and don’t do the education.” I really do think people need to understand what’s going on. I really think it is important that we provide continuous education.

One of the things I heard, and this is already 10 years ago, from a regulator, I would perhaps even say a regulator from a certain country: this person said, "Well, we try to familiarize people from our country with the concept of data protection." What they do is, for instance, in that country's language textbooks, but let's say English textbooks because it's easy, they have stories about data protection so that, unconsciously or very naturally, children learn, hey, we have these rights and we can take control of our data if we want to, and this is what it means. I would like to see more of this educational effort coming out from all over the place, actually.

Kyle Hollaway:

Yeah. That’s a really interesting thought there, more institutionalizing that effort to educate earlier in a life cycle so that, like you said, it just becomes more of an innate understanding versus a lot of us that have been around in a non-regulated world for a long time starting to, trying to absorb and understand that regulation. I knew this would happen, that we would start talking and that we could talk forever because there’s so many interesting parts to this discussion and everything. We’re starting to run out of time, so we’re going to shift the conversation just for a second. I want to be forward-looking. I know we’ve talked a lot about legislation, even some things that are currently about to happen, but let’s touch on the topic of AI for a minute, because AI I think enters and opens up a whole new area of privacy considerations because of its ability to massively process data and data points that maybe aren’t even ones that we’ve traditionally thought about.

I was reading an article just yesterday about how human locomotion, our motion, is enough that AI can actually determine individual identity just by looking at a video of somebody walking. It can process that across the hundreds of thousands of people who are maybe walking down the street, and it can start to identify who you are just from your gait, from how we hold ourselves because of our skeletal structures. That kind of blows my mind: it's not even biometrics as I would have understood biometrics, but observable metrics that suddenly become something I can be identified with. Give me some of your thoughts on AI. What are you thinking about it in terms of privacy, as a chief privacy person within a large enterprise?

Sachiko Scheuing:

Well, I think AI really is a double-edged sword, because some of the new uses of AI, I mean, your example was a little bit scary, but there are also really, really practical, mind-blowing examples. My sister is on the creative side of marketing, and she made a chatbot for a pharmaceutical company in the United States, and then they were like, "Oh, we are also in the Japanese market," and she used AI, and it was so flawless. She was like, "You can just use it one-to-one in Japanese."

The other thing she was doing was building a complete marketing suite, with branding and a landing page, chatbot, social media posts and so on, things that creative bureaus usually spend hours and days working on, in 25 minutes. Can you imagine? I'm like, "Oh, my goodness, do you still have a job in a couple of years?" and she was like, "Get this. After five minutes on top of that 25 minutes, so within half an hour, the entire campaign can be localized for Brazil, for China, for Japan, for the UK and so on."

The potential, I think, is really, really huge, but the other side of the sword is, of course, that AI makes up facts. With ChatGPT, for instance, I think it is really, really dangerous to use ChatGPT's answers as a source. The other dangerous use of AI is that it can create so many scripts in no time. If hackers get hold of it, and I bet they have already, it will really make the IT ecosystem a dangerous place.

I think we all agree, or I hope we all agree, that banning it is not the option, because imagine if we had banned data collection due to the uproar about the census and all that back in the '80s: we would not have the comforts and innovations we are enjoying today. The same can be said of AI. I think we need to find a way to live with AI in a responsible manner, and only then can we reap the benefits while using AI in a controlled, ethical manner.

Dustin Raney:

As consumers, having gone through Covid, having gone through digital acceleration, all this stuff with our data, AI, there's a lot of change happening. There's a lot going on. I think there might be a need for a consumer therapy sector to grow, to help us understand our value again. I think that might be one of the foundational things: yes, technology is going to be required to keep us all together and help us have more meaningful experiences, but, oftentimes, I don't know that consumers understand how valuable they are and how potentially abusive some of the players in the ecosystem have been with that value. At the end of the day, my hope is that that gets balanced out a little bit, despite what's happening with AI, ChatGPT, even AR and VR, where Apple just released its new headset, which is going to be mind-blowing-

Sachiko Scheuing:

Will you pay $3,400 for that stuff?

Dustin Raney:

Yeah. I mean, it’s like some people will, right? Eventually, that’s going to be mass adopted, and we’re going to be walking down the street and, like Kyle said, I can look over and see someone’s gate and be like, oh, there’s Kyle walking down the street in New York City amongst the hundreds of… If he says, “I can do that,” right? Will the technology exist for you to individually say, hey, I don’t care what everybody knows about me or what specific people know about me. I think that’s where technology is going to come in. Maybe give the consumer more control because, some people, they want to be popular, they want to be known, they want to be, whereas other people want to be super private. They don’t want anybody to know them. I think there might be a balance. What are your thoughts there?

Sachiko Scheuing:

Yeah, I think everybody’s different. That is indeed true. Of course, these people who are saying, “I don’t have anything to hide. Everybody can know everything about me,” until one of these days, you’ll be like, “Oh, I shouldn’t have made that part transparent,” and so on. Human being grow up and we mature and we look back about all these things that we did during our college years and so on. I’m not really sure if your decision back in the college time would still be valid today. Once again, I think that comes down to people being aware of the different consequences, people being aware of the rights.

Speaking of rights, I think it is really interesting to read the United Nations Universal Declaration of Human Rights or, for people who are interested in the European side of the story, the derivative thereof, the EU Charter of Fundamental Rights. It's really interesting. You know what? To sum it all up in my own words, it's about showing respect and being respected. It's no different. It's our basic, fundamental social etiquette, if you will, just transported over to the digital sector. I think educating yourself, reading around, being respectful to others, regardless of what you do and in whichever form, these are very important for all of us.

Dustin Raney:

Kyle, I would call that a mic-drop moment for Sachiko.

I don’t know that there needs to be anything else said in this episode, but, hey, Sachiko, we are out of time, unfortunately. As we knew this would be an incredible, very informative, taking tech and humanizing it, thinking about people, thinking about all the players bringing a balance. We hope our listeners got as much out of this as we did.

Thank you all for sticking with us through some of this. I know, when you're talking about privacy and compliance and regulation and technology, it can get heady, but it's important. Thank you for bringing clarity there, Sachiko. We look forward to having you back again sometime in the future.

Thank you, everyone. Thank you for joining us today, and we’ll catch you next time on Real Talk.

Sachiko Scheuing:

Thank you very much.

Dr. Sachiko Scheuing

European Privacy Officer

Dr. Sachiko Scheuing is the European privacy officer of Acxiom, an Interpublic Group (IPG) company and a global leader in marketing services. Sachiko combines the theoretical and practical experience she has gained over more than 20 years to spearhead Acxiom's government affairs and compliance activities.
