The question of what’s sensitive may seem like a simple one, but is in reality quite complex.
For decades, we have defined some data as sensitive. For example, our Social Security number has always been sensitive because it is a key piece of data that identity thieves need to steal our identity. Other examples include financial data, such as bank account and PIN numbers, and health records. This kind of data is typically considered sensitive either because it aids identity theft, as the SSN does, or because it could embarrass us if disclosed to others, as is the case with our health records.
Some definitions of sensitive data are codified into law. The Gramm-Leach-Bliley Act (GLBA) and the Health Insurance Portability and Accountability Act (HIPAA) require stronger protections and affirmative consumer consent for certain uses of financial and health data, such as marketing. Other examples include the Video Privacy Protection Act (VPPA), which imposes restrictions on the use of data about the videos we rent, and the Driver’s Privacy Protection Act (DPPA), which places special restrictions on the use of driver’s license data. Yet another example, in a library (physical or virtual), is the right to keep the books and articles we read from being examined or scrutinized by others.
In more recent years, industry has continued to expand the scope of what we consider sensitive data. Industry codes of conduct have identified data such as biometric data and precise location as sensitive because of how revealing they can be about what we do and where we spend our time. These are good examples of how sensitive data evolves over time as new types of data are created, usually through advances in technology.
Be careful with your use of data as well as your messaging if you use any of these types of sensitive data. You may offend your audience more than delight them.
A second consideration in defining sensitive data relates to vulnerable populations. We see this in law with the Children’s Online Privacy Protection Act (COPPA). Children under the age of 13 have special protections because they are a vulnerable population. There is interest in setting up something a little less restrictive for teens, who aren’t as vulnerable as children but certainly not as mature as adults. Other discussions focus on the older generation, which grows in numbers as we live longer. Couple this with how degenerative disorders, like Alzheimer’s, affect the ability of seniors to avoid being scammed, and it logically raises questions about whether the elderly should be considered a vulnerable population.
Another example is minorities, who have long been considered a vulnerable population that should be protected against discriminatory practices. In law, Regulation B of the Equal Credit Opportunity Act (http://www.federalreserve.gov/boarddocs/supmanual/cch/fair_lend_reg_b.pdf) addresses lending practices that may discriminate against protected populations based on race, color, religion, national origin, sex, marital status, or age.
Extra precautions should be taken when marketing to any of these vulnerable populations. Know the law, and offer appropriate products and services.
Uses of Big Data to Include and Exclude
As big data and sophisticated analytics become more common business practices, it gets easier to avoid using the specific sensitive data elements discussed earlier and instead use other, non-sensitive data as a surrogate. This can be done intentionally or entirely unintentionally. Because targeted marketing is all about identifying a specific audience, even an at-risk audience, we have to consider the impact of our marketing efforts.
The FTC’s public workshop last year, Big Data: A Tool for Inclusion or Exclusion? (https://www.ftc.gov/news-events/events-calendar/2014/09/big-data-tool-inclusion-or-exclusion), focused on these risks. The workshop pointed out that the same data can be used to include vulnerable populations with appropriate offers, a positive outcome, or as a tool to exclude them from better prices or target them with predatory products and services, a negative outcome.
Edith Ramirez, Chairwoman of the FTC, said in a 2014 speech at the Media Institute: ‘Big data also presents the risk of what others and I have called “discrimination by algorithm,” and what the White House has called “digital redlining.” Big data analytics raises the possibility that facially neutral algorithms may be used to discriminate against low-income and economically vulnerable consumers. There is the worry that analytic tools will be used to exacerbate existing socio-economic disparities, by segmenting consumers with regard to the customer service they receive, the prices they are charged, and the types of products that are marketed to them.’
All marketers should be aware of how they are using sensitive data, but perhaps more importantly, they should also be aware of how their marketing practices could be used to exclude or take advantage of vulnerable populations.