Named one of Poets & Quants’ “40 Best Business Professors Under 40” in 2021, Sandra Matz, the David W. Zalaznick Associate Professor of Business at Columbia Business School, is a leader in the field of computational social science. She coined the term psychological targeting to describe the way digital traces can reveal human psychology.

In her forthcoming book, Mind Masters: How to Turn Nosy Algorithms into Powerful Allies, Matz shows how data enable external influencers to sway the choices we make, both individually and collectively. Yet, in her view, this power can also work in our favor. She suggests ways individuals can unite around common interests to leverage their data ethically, to understand themselves better, and to find the resources they intentionally seek.

In the following essay, Matz explains these concepts:

In One Boy’s Day (Harper & Brothers, 1951), psychologists Roger Barker and Herbert Wright set out to study actual behavior. They hired eight research assistants to observe a 7-year-old boy for 14 consecutive hours, recording his natural “psychological habitat” at one-minute intervals. With the goal of producing an objective report of the boy’s behavior, they noted how he woke in the morning, played with his dog, rode his bike, and interacted with his parents, teachers, children of various ages, and adults in the community.

Fast-forward 70-plus years, and each of us now has millions of observers: our devices track us by the second, gathering and recording data. This data-crawling digital equivalent of Barker and Wright’s research assistants reads my Facebook messages, collects my credit card purchases, and records my facial expressions and casual encounters through some 50 million public cameras across the United States.

Not only can we collect more data today than ever before; computers can also interpret it. They can translate this seemingly mundane, innocuous information about what we do into highly intimate insights about who we are, and ultimately into prescriptions for what we should do. I call this process psychological targeting, and I’ve been studying it for more than a decade.

“I argue that we can’t fight this fight alone. No one has the knowledge, time, and energy to make protecting their data and policing third parties that use it a full-time job.”

- Sandra Matz, the David W. Zalaznick Associate Professor of Business

My path began in 2011, when I met two postgrads at Cambridge, Michal Kosinski and David Stillwell, who had developed the Facebook app myPersonality in 2007. It went viral: in less than five years, more than 7 million people had taken a personality test through it. The researchers then invited users to share information from their Facebook profiles with the app for research purposes. From that, they were able to generate an enormous data set combining people’s digital traces with insights into their psychology.

What Michal and David had stumbled upon marked a real pivot in how we could study the human psyche. Instead of relying on observers, survey responses, and highly stylized lab experiments, psychology finally stood a chance of living up to its promise of being concerned with people’s everyday experiences and behaviors.

By the time I joined Michal and David in Cambridge to do my PhD, they had just published their first scientific article showing that computers could accurately predict people’s intimate psychological traits—like personality, political ideology, sexual orientation, and IQ—from the pages they followed on Facebook.
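To make that idea concrete, here is a minimal sketch of such a prediction pipeline. It is an illustration rather than the researchers’ exact method: the user-page “like” matrix and the binary trait below are synthetic, and the model, a truncated SVD of the like matrix followed by logistic regression, is simply one reasonable choice in the spirit of that early work.

```python
# Illustrative sketch (synthetic data, not the published pipeline):
# predict a binary psychological attribute from which pages users follow.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: 2,000 users x 500 pages; 1 = user follows page.
n_users, n_pages = 2000, 500
likes = (rng.random((n_users, n_pages)) < 0.05).astype(float)

# Made-up binary trait, weakly correlated with the first 40 pages.
signal = likes[:, :40].sum(axis=1)
trait = (signal + rng.normal(0, 1.5, n_users) > signal.mean()).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    likes, trait, test_size=0.25, random_state=0)

# Compress the sparse like matrix into 50 latent dimensions, then classify.
svd = TruncatedSVD(n_components=50, random_state=0)
Z_train, Z_test = svd.fit_transform(X_train), svd.transform(X_test)

model = LogisticRegression(max_iter=1000).fit(Z_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(Z_test)[:, 1])
print(f"Held-out AUC: {auc:.3f}")  # well above 0.5 = predictive signal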

Since then, my colleagues and I have published numerous articles showing how computers can get to know you intimately.

But so what? What does it mean that computers can peek into our psychology and understand what lies below the surface of the behaviors they can observe? What does it mean for you and me? And for society at large?

It’s a question of power: understanding your psychological needs, preferences, and motivations gives others power over you. Power to influence your opinions, emotions, and ultimately your behavior.

It doesn’t take much imagination to understand that psychological targeting, in the wrong hands, could become a weapon. Cambridge Analytica made that abundantly clear in 2018 when it allegedly created psychological profiles of millions of Facebook users without their knowledge and then hit them with fear-mongering political ads tailored to their psychological vulnerabilities.

But let’s look at it from a different angle. This window into the psychology of millions of people, and the ability to change behavior that comes with it, also presents a remarkable opportunity. My research, for example, has been used to predict and prevent college dropout, guide low-income individuals toward better financial decisions, and detect early signs of depression. That last one offers a chance not only to catch depressive symptoms early, before they develop into full-blown clinical depression, but also to offer personalized advice or resources.

How do we amplify the positive sides of psychological targeting to make it work for us instead of against us? I argue that we can’t fight this fight alone. No one has the knowledge, time, and energy to make protecting their data and policing third parties that use it a full-time job. But we can design a system that helps us do just that.

I’m talking about establishing small communities designed to help you manage your personal data. These entities, known as data trusts or data co-ops, would be legally obligated (e.g., through fiduciary duties) to act in the best interests of their members.

We need allies: like-minded people who have similar interests and share the same goals. Among the roughly 8 billion people in the world, you can find others with the same problems and ideas as you. Together, you gain the advantages of sharing data without the costs of losing your privacy and self-determination.