Silicon Valley’s search and social media giants determine who sees what information, and how. Never before has such a small number of companies had the power to connect billions of people instantly—and with it, the ability to shape and alter the information ecosystems of entire societies.
Rob Reich is a professor of political science and codirector of the Center on Philanthropy and Civil Society at Stanford University. David Siegel is cochair of Two Sigma, a financial services company that uses technology and data science to optimize economic outcomes in investment management, insurance, and related fields.
At the 2019 World Economic Forum in Davos last month, numerous world leaders publicly called for greater international regulation of how data is collected and used. The tech industry has struggled to respond to this debate with a coordinated, constructive plan of action. If it doesn’t do so soon, the result may be overly blunt, rigid, and potentially counterproductive regulation. It’s not too late for the tech industry to help formulate rules that make sense for everyone, but time is running short.
The crux of the problem is the opaque process that determines how algorithms curate information for billions of users. Every time someone uses search or social media services, they’re relying on a secret and proprietary algorithm tuned to maximize something—usually user engagement with the service. Transparency and accountability are largely absent.
Experimentation and risk-taking are cherished hallmarks of Silicon Valley, but the norms around algorithmic governance have become a free-for-all. History teaches us that unregulated marketplaces can produce a race to the bottom, socializing harms while privatizing the gains. The financial crises of the 20th and 21st centuries demonstrated that unregulated markets cannot safeguard all of society’s interests. Now Silicon Valley’s search and social media giants, long resistant to oversight, face growing scrutiny. Too often, company-level efforts amount to a “trust us, the engineers are working on it” approach. These tactics have fallen short.
To protect the public interest and their own businesses, these companies should set up a robust self-regulatory organization along the lines of the Financial Industry Regulatory Authority (FINRA), an SRO that derives its authority from the Securities and Exchange Commission. Thanks to its independence from bureaucratic government agencies, FINRA is effective—and relatively nimble—at policing securities firms with sensible rules.
Twenty years ago, regulators faced similar challenges in the financial industry. Rules were often arbitrarily enforced and created an uneven playing field between larger incumbents and smaller players. Ultimately, through a partnership between industry and government, FINRA was formed as a more agile and effective way to help protect the public interest. Industry’s involvement helped ensure that in-house technical expertise accompanied strong rule-writing and enforcement powers, reducing regulators’ reliance on blunt and infrequently updated laws.
The key advantage of strong self-regulatory organizations like FINRA is their ability to bridge the gap between appropriately slow-moving governments and complex, fast-changing industries. Since FINRA is technically not a government body, it is better able to provide close, active oversight while keeping pace with constant shifts in the financial industry. At the same time, the government sanction FINRA enjoys is essential to avoid the appearance of creating a cartel, a concern that plagued its precursor, the NASD.
Promoting public trust in the integrity of information on search and social platforms is more crucial than ever. Search personalization and similar algorithms work well—they keep users engaged by delivering personally relevant content—but have a dark side: The way search results are presented and the order in which social media posts appear in a feed can manipulate public opinion and behavior. In effect, whether they mean to or not, these companies are inching toward the creation of a custom echo chamber for everyone on the internet—but there’s no governance or transparency around the process.
For example, such a self-regulatory organization could help ensure that companies’ use of proprietary algorithms supports society’s fundamental interest in a high-quality information ecosystem, just as FINRA examines trading data to detect fraud. Crucially, it would do this without compromising companies’ valuable intellectual property or removing incentives to innovate. It could create clear rules about an independent appeal process when companies ban or delete information, and it could set forth requirements on algorithmic accountability.
It’s true that some companies have instituted their own policies on these issues—Facebook recently announced an effort to create an independent appeals process for its content moderation policies. But no framework applies to the industry as a whole. Self-regulatory organizations’ ability to balance the public interest with commercial imperatives should make a broad framework attractive to all stakeholders involved.
The passage of the EU’s General Data Protection Regulation and California’s Consumer Privacy Act signals that it’s time to change the way we approach governance of the online public sphere. We must seek solutions that avoid the pitfalls of clumsy legislation and signal the maturation of the tech industry as it comes to grips with its power.
If search and social media companies can’t figure out how to supervise themselves constructively, lawmakers are bound to step into the void more aggressively. Time is running out for industry leaders to take the initiative and build an effective oversight model themselves.
–
This article first appeared on www.wired.com