It is not lost on U.S. lawmakers that competition legislation intended to rein in “Big Tech” would have far-reaching effects on the economy. What they perhaps did not bargain for is the impact such legislation would have on consumer privacy and security—even national security. ACT | The App Association and several aligned groups have pointed out to Congress that the American Innovation and Choice Online Act (S. 2992 / H.R. 3816) and Open App Markets Act (S. 2710 / H.R. 7030) would weaken existing security protections on consumer devices and platforms. It is perhaps less of an argument and more of an observation that undoing platform security mechanisms and mandating that consumers can sideload apps on their devices inherently invites the risk of malware, eroding the trust consumers place in their internet-enabled devices. We have said this more than a few times, and if Congress continues down this path, we will continue to call attention to the problem.
The Experts Agree
On the national security front specifically, experts including James Clapper and Leon Panetta wrote to congressional leaders in April about their concerns with the nondiscrimination regimes in bills like S. 2992 and H.R. 3816:
“Legislation from both the House and Senate requiring non-discriminatory access for all ‘business users’ (broadly defined to include foreign rivals) on U.S. digital platforms would provide an open door for foreign adversaries to gain access to the software and hardware of American technology companies. Unfettered access to software and hardware could result in major cyber threats, misinformation, access to data of U.S. persons, and intellectual property theft. Other provisions in this legislation would damage the capability of U.S. technology companies to roll out integrated security tools to adequately screen for nefarious apps and malicious actors, weakening security measures currently embedded in device and platform operating systems. Our national security greatly benefits from the capacity of these platforms to detect and act against these types of risks and, therefore, must not be unintentionally impeded.”
Rep. Eric Swalwell (D-CA) shares these concerns about H.R. 3816, publishing an op-ed on May 10 that echoed many of the same sentiments.
Legislation Should Not Force Platforms to Promote Disinformation Campaigns by Foreign Rivals
An important aspect of platforms is their ability to moderate the content users see[1] or, in some instances, to restrict access to a platform altogether. The removal of apps and content from Russian state-owned media outlet Russia Today (RT) across platforms including Apple’s App Store, the Microsoft Windows Store, Roku’s Channel Store, and Google’s YouTube in Europe and the United States is a case in point. As global brands pull their business from Russia, banning the content of a network backed by the Russian government is how platforms manage their liability: (1) protecting the perception of their brands and (2) ensuring they are not contributing to Russian aggression. It is in businesses’ interests to do this, and it is in our national security interest to allow platforms the flexibility to do so.
Another effect of RT’s removal is that it clamps down on foreign rivals’ attempts to spread disinformation. Some have called RT’s deplatforming undemocratic, but Americans largely applauded it. Platforms rightly remove apps that violate the basic consumer protections in their developer guidelines, though the decision is often harder than whether to remove foreign state-backed propaganda. Parler’s temporary removal from the App Store drew mixed reviews from politicians, but Parler was ultimately willing to commit to protecting its users consistent with App Store requirements. The same cannot be said for RT, which remains unavailable on the major app stores. Balancing speech and security is genuinely complex: the First Amendment bars the U.S. government from restricting protected speech, while companies must manage their own liability for the apps they host.
The relatively clear-cut case of RT’s removal from app stores underscores that a blanket requirement for app stores to host all apps would create serious national security problems while making day-to-day platform management far more difficult. Our members at the App Association are primarily business-to-business operations. When their apps, or their clients’ apps, are scraped, laced with malware, and sold on the same store, they rely on the app store to take down the fraudulent copy. Mandating that platforms host valueless, malicious apps (domestic or foreign) alongside our members’ content is not the solution. The only winners in that scenario are those looking to profit illegally or to push disinformation for political gain, at the expense of what is best for business and security.
[1] Content moderation often falls under the purview of Section 230 of the Communications Decency Act. For those unfamiliar with Section 230, the law shields providers from liability for information published by others on their services. In other words, platforms are generally not legally responsible for content posted by their users.