By Max Gulker
Our economy depends on the continued forward march of technological progress. But with this growth come new problems and, inevitably, new regulation. We must ensure that this regulation does not stifle tomorrow’s innovations, whose details we cannot predict in advance.
In a recent New York Times op-ed, Kara Swisher zeroes in on the way the large tech companies on which we’ve come to depend violate our privacy. She writes that it’s time to “think harder about the trades we are making from the convenience we get from our gadgets. And maybe we put in place some rules — rules that have teeth — on big tech companies.”
Readers will likely concentrate on Swisher’s call for greater government regulation. Many of us favor less regulation, and others more. That’s an important debate, but in the world in which we currently live, government regulators won’t be closing up shop anytime soon, nor will tech companies be nationalized.
Much of the near-term answer to how to protect our privacy online lies in the other regulatory force Swisher identifies: we as consumers must think harder about when and how we divulge private information. This kind of regulation truly has teeth because it deploys billions of points of observation and response, which keeps mistakes limited and local while allowing companies to keep innovating and disseminating best practices.
Rather than assuming that online privacy lives and dies in their hands, government regulators should focus on how to create the right conditions to deploy this powerful decentralized force.
Teeth Cut Both Ways
Swisher imagines new government regulation of companies like Facebook and Google taking the form of new laws and agencies to enforce them:
The question, of course, is what those rules should look like. Some argue the United States needs a hefty, multipronged national privacy bill that would include more stringent requirements for how tech giants run their platforms. (Some even think there should be a new agency to oversee the industry.)
In the aftermath of Facebook’s fiasco with Cambridge Analytica and several smaller scandals involving Google, some might say the answer is simple: tech companies shouldn’t compile or sell the personal data they receive during the course of operations. But that is a fundamental part of what this generation of tech companies does.
Targeted advertising, finely tuned internet search, and customized product recommendations are the lifeblood of companies like Facebook, Google, and Amazon. The ability to use and profit from these data, theoretically anonymized at the individual level, is why so many of their services are “free” to consumers.
New regulations must therefore draw boundaries while still permitting uses of data deemed less threatening to consumers’ privacy. But an activity may be safe, or at least tolerable, in the hands of one type of company and problematic in the hands of another. As regulations grow more complex and arbitrary across lines of business and ultimately across firms, the familiar problems with government regulation grow teeth of their own.
Regulators want to remain independent of the firms they oversee, but they cannot possibly be as knowledgeable as those firms. With interaction between the two parties, however, comes influence peddling. Market-leading incumbents are known to lobby the government for rules that raise barriers to entry or yield advantages over existing competitors. With observers already concerned about tech giants’ market power, a flood of new top-down regulations could make the problem worse.
The dilemmas become even more vexing when considering innovation that hasn’t yet happened. Swisher’s call for regulators to act now before tomorrow’s technology becomes a threat to privacy actually exposes the problem inherent in doing so:
Do you like the idea of A.I. comparing your facial expressions to a company’s top and bottom performers during a job interview? I don’t. Do you want cameras in every device in your home? I don’t. Do you want your television-watching linked to your search history linked to your buying data? No, thank you.
Swisher is right to worry about the privacy implications of these technologies still in their infancy. But the applications she names are speculative; the only certainty is that there will be many uses we haven’t come close to thinking of yet. A complex web of inflexible rules will tie the hands of future innovators, throwing a wrench in the gears of the evolutionary process that makes technological innovation so valuable to society.
Knowledge Is Power
While Swisher may not approach the issues described above with enough caution, she does do a good job of stressing the importance of an educated consumer base. Relying exclusively on regulators to protect oneself against identity theft and other inappropriate uses of data is like leaving one’s car keys in the ignition because a would-be car thief faces the threat of jail time. No matter the strength of that regulatory backstop, a well-informed consumer also locks the door and pockets the keys.
Consumers must demand clear and accurate disclosure from tech companies about how their data will be used. Such disclosure should cover the steps taken to keep individuals anonymous, the limits placed on internal uses of the data, and whether and how companies may sell data to third parties.
Top-down rules from regulators on how data may and may not be used can prevent certain behaviors, but an educated consumer base leads tech companies to take consumer decisions into account in searching for better outcomes. Suppose Facebook earns substantial revenue from uses of data that many consumers, once informed of that fact, find unacceptable. In a market of informed consumers, Facebook could offer two tiers of service: one in which consumers pay a monthly fee, and another in which consumers pay no fee but Facebook has greater rights to use their data.
Such multitiered solutions are unlikely to develop under a tight regulatory regime because such complex outcomes are far more difficult to enforce than simply putting a rule in place for everyone and preempting consumers’ choice. More generally, a system in which informed consumers interact with innovation by firms can reach outcomes that regulators could never formulate in advance.
Billions of Regulators
We live in a world of active government regulation. Using the arguments above to insist that we don’t need any regulation of tech companies, or that any regulation will be harmful, may be theoretically satisfying to some, but it is not a realistic outcome of the current debate.
Government regulators can be part of the solution to the privacy issues surrounding tech companies by drafting and enforcing clear rules requiring thorough, accurate, and straightforward disclosure of how those companies will use consumers’ data. Rather than guessing what outcomes consumers want, at the risk of curtailing future entry and innovation, regulators on this path would help foster an educated consumer base.
We are much better off with billions of decentralized regulators who own their own data and can make informed decisions than a few centralized bodies that guess which rules are ideal. Hopefully the centralized regulators we do have will take steps to empower the rest of us.
Max Gulker is an economist and writer who joined AIER in 2015. His research often focuses on free markets and technology, including blockchain and cryptocurrencies, the sharing economy, and internet commerce. He is a frequent speaker at industry conferences, especially on blockchain technology. Max’s research and writing also touch on other economic topics, including governance, competition, and small businesses.
Max holds a PhD in economics from Stanford University and a BA in economics from the University of Michigan. Prior to AIER, Max spent time in the private sector, consulting with large technology and financial firms on antitrust and other litigation. Follow @maxgAIER.
This article was sourced from AIER.org