Dark Data Patterns are not the Future

Last week I talked about the need to understand the monetization and commercialization opportunities of your data assets. This week I want to touch on socially responsible versus dark patterns of data usage. This is a hot-button topic for many and includes several facets, from privacy to bias to ownership. In the United States, corporations seem to be in control of personal data, while in the E.U., a legal framework (GDPR) exists to put that control in the hands of individuals. In other places, like China, the government is firmly in control of the utilization of personal data.

Dark Patterns

Trust can erode quickly when people discover that their data has been used in ways that they don’t agree with, thought they were protected from, or modified in some way that is unexpected.

Popular social media companies have led the way in developing business models around data, many of which are arguably what I think of as dark patterns. I label them thus because these companies collect, use, and share data about their users in ways the users have no knowledge of or explicit consent for; the privacy policies are often so vague and complex that mere mortals have no chance of understanding how their information might be used.

There are other manifestations of dark patterns as well. One example is silently embedding tags, trackers, or other artifacts in user-created content without users’ knowledge. I ran across an example of this just last week with a popular online grammar and spell checker. While reviewing a blog post, I discovered that a number of tags had been inserted around various portions of the content, adding custom CSS classes and attributes. A web search on this particular tag revealed that a great many sites are “infected” with this dark insertion. Sadly, I was unable to find any declaration of what the tag does on the grammar checker’s site.
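If you suspect a tool is injecting markup into your content, you can scan the exported HTML for elements carrying unexpected class names. The following is a minimal sketch using Python's standard-library HTML parser; the `"gr_"` class prefix and the sample markup are hypothetical stand-ins, not taken from any actual product.

```python
from html.parser import HTMLParser


class InjectedTagScanner(HTMLParser):
    """Collect (tag, class) pairs whose class attribute matches a suspect prefix."""

    def __init__(self, suspect_prefixes):
        super().__init__()
        self.suspect_prefixes = suspect_prefixes
        self.findings = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the opening tag
        for name, value in attrs:
            if name == "class" and value:
                for cls in value.split():
                    if any(cls.startswith(p) for p in self.suspect_prefixes):
                        self.findings.append((tag, cls))


# Hypothetical exported content with an injected span
html = '<p>Hello <span class="gr_marker-123">world</span>!</p>'
scanner = InjectedTagScanner(["gr_"])
scanner.feed(html)
print(scanner.findings)  # [('span', 'gr_marker-123')]
```

Running something like this over your own published content is one quick way to find out whether a third-party editor has left artifacts behind.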

There is another dark pattern emerging in the world at the moment: opaque machine learning and artificial intelligence. Facial recognition technology is my favorite example of this. With only one exception that I’m aware of, this technology (and the data behind it) is so inherently biased and flawed that it is actively causing harm. Another example of data plus machine learning that exemplifies the dark pattern is the COMPAS system used by some courts to assess defendant risk. The system is completely opaque, and in some circumstances defendants are not even aware that a computer algorithm is determining their fate.

Socially Responsible Patterns

I would personally encourage anyone looking at monetizing or commercializing data assets to ensure they are on this side of the spectrum. Doing so will promote rather than erode trust and can lead to the discovery of new ways to use data in economically and socially beneficial ways.

Being socially responsible with your data strategy does not mean that you can’t monetize or commercialize it. Far from it. However, it does mean being upfront and transparent with your data and privacy policies. It also means understanding and empathizing with your audience. For example, it might be possible and even desirable to see more relevant content based on personal information, so long as that information is anonymized and not exposed in the process.
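One common building block for this kind of privacy-preserving personalization is pseudonymization: replacing direct identifiers with a keyed hash before the data reaches analytics or recommendation systems. The sketch below is illustrative only; the key name and record fields are hypothetical, and note that pseudonymization alone is weaker than full anonymization, so it should be one layer among several.

```python
import hashlib
import hmac

# Hypothetical secret held server-side, separate from the data store.
# If this key leaks, pseudonyms can be linked back to identities.
PSEUDONYM_KEY = b"rotate-me-regularly"


def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256) so
    downstream systems can group events per user without seeing who it is."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()


record = {"user_id": "alice@example.com", "clicked": "article-42"}
safe_record = {**record, "user_id": pseudonymize(record["user_id"])}
```

The keyed hash is deterministic, so a recommender can still learn "this user likes articles like 42" while the raw identifier never leaves the trusted boundary.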

There are also cases where using and selling aggregated data can create social benefit. Therapeutic advancement and precision medicine are great examples of this. Individuals readily consent to having anonymized medical information used to help create new and better treatments, for example. The key being, consent. These people know up front what data is being collected and how it will be used.

The great news is that it’s not that hard to implement socially responsible data usage. The healthcare industry has had to do this with personal health information (PHI) in the U.S. as a result of HIPAA. Companies operating in Europe have had to implement controls for GDPR. While not perfect, these at least provide patterns and processes that have shown how to protect data while using it to create economic and/or social value at the same time.

A Double Bottom Line

If you’ve read this far, you may have noticed the tie-in between social impact and economic value. These two things constitute a double bottom line that will become inextricably tied to each other over the next generation.

It is clear that one can create outsized economic value for oneself using dark patterns, and that this comes with a negative social impact, both in terms of trust and on the social fabric itself. There are so many examples of this in the world today that I don’t even need to cite one here; you can likely think of half a dozen without trying. Pursuing this pattern, however, does not lead to any form of double bottom line, and worse, I predict it will ultimately lead to the destruction of economic value. It will become a double liability.

What is perhaps not as clear from history and case studies, but which needs to be made self-evident, is that one can also create outsized value while creating a positive social impact. Legal frameworks and regulation (at least in the Western world) are starting to catch up with the reality of the data economy. When they do, it will be the organizations that were forward-thinking enough to practice socially responsible data patterns, building out the double bottom line of economic and social wealth, that will be in the catbird seat.

Where do you want to be on this issue when the world wakes up to data? Crying mea culpa and playing catch-up, or enjoying the fruits of an active double bottom line built on socially responsible data patterns? It’s time to evolve the business models of the data economy to realize this double bottom line.

Guard Against Dark Patterns

In Star Wars, the Jedi knights were the guardians against the “dark side”. In real life, it’s up to each of us to become guardians against the dark patterns of data exploitation. 

  • Justify any utilization of data via a socially responsible monetization and commercialization framework
  • Educate your employees and partners in data issues such as ethics, bias, and privacy
  • Determine acceptable use policies for the various types of data assets you have
  • Invest in systems and processes to protect the data (and your people)

To do this, you’ll need to take a few steps up front (I couldn’t come up with a pithy acronym for this one, lucky you):

  • Commit to being transparent with how you intend to use data and what your policies are
  • Build your data strategy with social responsibility as a core tenet
  • Set and enforce policies on acceptable use of data for your own organization

And finally:

  • Resist, resist, resist the temptation of short-term revenue via the use of dark patterns.

Categories: Blog, Digital Transformation, Enterprise, Modernization, Regulatory, Strategic Planning, Technology Strategy

Frank Oelschlager
23 Oct, 2019