You’re in a rush when you get to the drugstore to pick up a few things on the way home, so you don’t really pay attention to the camera hanging over the entryway and pointed down toward you.
Maybe you should.
That’s because that camera is likely not only taking a high-resolution snapshot of your face but also storing it in a database for as long as that large drugstore chain chooses, with no other information associated with you. But every time you walk into that drugstore, or any of the chain’s other stores for that matter, the same system will capture another image of your face, and the algorithms built into the software will match it to the images taken on your previous visits. And even though the system won’t have your name and address, it will know an awful lot about when and where you’ve entered its stores just from that data.
But the day could come when the company decides to match that face to you each time you complete a transaction at the register, giving it a bunch of interesting data about your habits and your purchase history.
Oh, and the next time you come into the store? The system will match your face automatically and maybe send a push notification to the pharmacist that you’ve arrived to pick up your prescription.
It’s all perfectly legal. Systems like this are already in place not only in drugstores but in casinos and airports as well.
Who owns the rights to personally identifiable information?
Do consumers “own” the rights to their physical appearance? What about its storage and use in systems they have no control over?
Even in the wake of the Cambridge Analytica fiasco and the various data breaches we’ve all read about, we continue to give away a tremendous amount of data every day that could easily be used to identify us, even when we don’t believe we’ve agreed to that use. As platforms collect more and more data from our IP addresses, social media accounts, phones, and other devices, it has become easy to triangulate that data and build a “known” profile of us even if we’ve never explicitly entered it into a form on a website.
And once we are known, there may be some increased risk around the availability and usage of our personal information.
HIPAA regulations define “protected health information” as health information that could be used to identify a person individually: name, address, phone number, medical test data, that sort of thing.
Did you get one of those DNA heritage kits as a gift last holiday season? You probably logged into a website to see the results, but did you give any thought to the fact that your entire genetic footprint is now stored in an online database? More importantly, what if that test revealed something about your own health or your family’s genetic predispositions to certain diseases that you’d prefer was kept private? Did you really pay that close attention to what you actually agreed to when you clicked “Accept” on the privacy page of the website?
Given that such personal data is now being stored in databases, and that consumers are giving it up more freely than ever before, what happens to that data and how easily it can be accessed has to be top of mind for every marketer.
Our challenge as marketers is not the collection of data – we are already collecting it, and lots of it. The challenge is understanding what data is considered “personally identifiable” and would be subject to HIPAA or GDPR regulation. There’s more to this question than meets the eye, because we now need to discuss with our compliance teams how we are handling and managing this kind of data:
- Anonymous data that becomes attributable based on later behavior
- Data that is personally identifiable but falls outside the traditional definition (genetic information, for example)
- Data scraped or aggregated from social media activity that you may have opted into but that collects far more about you than you realize
- Information about you that’s captured without your explicit knowledge or consent but could become personally identifiable (that face-capture system)
These are challenges that have not only regulatory implications but moral and ethical ones as well.
As the technology your organization uses becomes more sophisticated, it’s more important than ever that your company’s policy on the collection and handling of this type of data is clear and well-considered, because it’s not just your DNA and your face being captured; it’s your fingerprints, your retinal patterns, and more. And all that data is likely being stored alongside all that browsing and purchase history.
Aprimo can help your organization navigate these and other data privacy and compliance regulations for all the personal data you collect. In life sciences, Aprimo helps global organizations stay compliant while improving speed to market. And our Financial Services customers are now able to enforce compliance and transform their entire enterprises using Aprimo’s platform.
Your customers trust you to handle that information ethically and intelligently. Because for them, it’s personal.