Click by click, we Internet ‘users’ are being used
By Woodrow Hartzog

The revelation that Cambridge Analytica was involved in extracting data from more than 50 million Facebook users has raised more than a few questions about just what went wrong and who is to blame. Facebook originally deflected the blame, saying, “People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked.” But this statement shows precisely what has gone wrong with the entire digital ecosystem. The Cambridge Analytica debacle reveals that the system worked exactly as intended. We never stood a chance.

Sometime in the early 2000s, tech companies and lawmakers converged on a path that turned the Internet against you. The apps you downloaded, the screens you interacted with, and the devices you used were slowly but surely designed to ensure that you never stopped sharing and exposing yourself.

The root of the problem is counterintuitive. It’s all about control. Specifically, our desire to have control over our data has been turned against us. In theory, it’s good to be able to control how our data are collected, used, and shared. It lets us weigh the risks and rewards of sharing and determine our own fate. Unfortunately, the control we’ve been given is a mirage. Companies are manipulating our perception of control, which causes us to share more and feel good about it in the process. Companies engineer our consent for even the riskiest of disclosures.

Of course, that’s not how it’s usually presented to us. Tech companies treat their requests for permission to collect and use our personal information as though the control on offer were a gift. Mark Zuckerberg and other tech leaders embrace this narrative with statements like “What people want isn’t complete privacy. It isn’t that they want secrecy. It’s that they want control over what they share and what they don’t.”

Lawmakers are also feeding this beast. They have created rules for privacy protections that revolve around the idea that companies respect our privacy so long as they get our consent and keep us informed. Tech companies get to say that they are simply complying with the rules set by lawmakers to give the people what they want: control. Despite all of this, we are not yet the masters of our own destinies. What’s missing are some basic rules about how digital technologies are designed.

Design is a lever of power. Industry will always use it. Even though we are called “users,” we are really just responding to our environment. We cannot click buttons and menu options that don’t exist. We cannot fully process all the warnings and pop-up notices we are confronted with, let alone the fine print. And we don’t always recognize attempts at persuasion and manipulation. Even when we do, sometimes they are hard to resist. Just ask anyone who made an impulse purchase in Candy Crush.

With everyone focusing so much on giving people control, policy makers have failed to create rules that keep technologies from collectively overloading our brains with requests for data and unfairly manipulating us. Even if every tech company perfected its privacy settings, people would still have hundreds, if not thousands, of other apps to deal with. Focusing too much on control leaves unanswered the difficult questions about the collective toll of manipulation, surveillance, and automated decision-making.

Tech companies are never going to stop asking for your data. Much like children who ask for sweets until their parents relent from exhaustion, these companies drain our finite ability to resist, shaping our choices so that exposure becomes inevitable. The system allows companies to launder their risk onto users.

Until lawmakers fill the design gap and we all demand trustworthy technologies, industry’s unquenchable thirst for data will dictate how these companies build their technologies. Lawmakers and industry should seek design that does not manipulate us or dangerously expose us regardless of what we consent to. Where choice is appropriate, we should demand better choices, not more options. With a clearer vision for the design of safe and sustainable technologies, we can work together for a more common good.

Woodrow Hartzog, professor of law and computer science at Northeastern University, is the author of the forthcoming “Privacy’s Blueprint: The Battle to Control the Design of New Technologies.”