Earlier this week, President Biden wrote an op-ed calling for regulation of the tech industry—starting with strong privacy protections for consumers. On the one hand, the President’s focus on privacy is welcome, because unexpected uses of data are among the most common and insidious ways that bad actors violate consumer trust. But the op-ed’s focus on digital platforms that “promote their own products while excluding or disadvantaging competitors” may signal a willingness to support policies that prohibit key platform privacy functions. Those policies would undo efforts to improve trust, so the op-ed’s allusion to them is cause for concern.
Privacy laws are about ensuring that consumers control their data—who has it, what they do with it, to whom they give it, and how long they keep it. Without a strong, national privacy law that responds to the current landscape, bad actors will continue to abuse the data they control. Accordingly, a federal privacy regime would help—rather than hinder—platforms’ ability to screen out bad actors and ensure that children are kept safe in a rapidly changing online environment. The Children’s Online Privacy Protection Act of 1998 (COPPA) was an important step in protecting young people, but we need updates to address the many ways children experience digital offerings today. Protecting children can and should be part of a larger data privacy bill in Congress.
Members of ACT | The App Association strongly support federal privacy standards and want to ensure that platforms retain the power to vet potential new apps for trustworthiness. Small developers without brand recognition depend on this trust to help consumers choose the apps they’ve developed. Bills like the 117th Congress’ Open App Markets Act (OAMA) and the American Innovation and Choice Online Act (AICOA) would upend that vetting process by forcing digital platforms to host apps that break privacy and other rules. Requiring platforms to host apps that ignore privacy rules, engage in unfair or deceptive practices, and otherwise harm consumers is not the way to build trust in tech. In fact, there are at least a couple of recent examples of companies settling with the Federal Trade Commission (FTC) over consumer protection law violations involving apps that OAMA and AICOA would otherwise force the app stores to carry by default. Meanwhile, Congress already supercharged federal antitrust enforcement last month with baseline budget increases totaling almost $80 million, along with an additional $1.4 billion or so over five years from an authorized increase in merger filing fees. We should see how federal enforcers plan to put this historic cash infusion to use before we dramatically alter the scale, scope, or purpose of antitrust law.
The App Association supports competition, which, in the app economy, has produced secure and privacy-protective software distribution options. Congress should require high standards for data collection, storage, and use, rather than stripping away the vetting mechanisms consumers rely on to inform themselves about these aspects of the market. As the recent Epic Games settlement with the FTC shows, many companies will ignore privacy laws to increase profits and will apparently seek ways around platform efforts to limit those abuses. The federal government should continue to punish these bad actors while ensuring strong privacy protections for all, whether consumers are on the open internet or browsing the app stores.
We need to build trust between consumers, platforms, and app companies. Strong privacy protections will help build that trust; forcing platforms to host bad actors will not. Congress should focus its efforts on creating strong privacy protections for all and enabling competition to flourish on the foundation of those rules.