After nearly a year of little movement, Congress took a major step toward a federal consumer data privacy law with the release of public draft bills. This week, ACT | The App Association is publishing a two-part series on these legislative drafts and what the text really means for consumers.
At the beginning of 2019, many of us were busy tempering our own great expectations that comprehensive privacy reform might see measurable progress this year. Conflicted, we expressed optimism and then dashed those hopes ourselves with a reminder that too many things can go wrong—which is better than staying naively optimistic to the end while the wiser wags say “told ya so.” Now, at the end of the year, some are saying that the recent circulation of separate Republican and Democratic proposals is a sign the naysayers in us all were right and that privacy won’t happen (not now, not ever!). Not so fast. These draft bills have important differences, but their publication brings into stark relief where each party needs to compromise, and suddenly the problem seems—if not manageable—less unmanageable. What’s more, both sides are willing to keep working toward a compromise. That, my friends, is progress.
Public reports have rightly focused on preemption and a private right of action as the two main sticking points. Sure enough, the Democratic and Republican drafts diverge markedly in these areas. In a positive development, Senate Commerce Committee Chairman Wicker said earlier this week he’d welcome discussions on giving consumers the ability to seek an injunction (in other words, a private right of action with limited remedies). This is the first time he has publicly indicated a willingness to discuss a private right of action. The outer limit for Republicans is likely to be the threshold an individual must meet in order to sue. The Democratic draft treats a violation of the bill’s prohibitions and requirements as an “injury in fact” for any consumer affected by the violation. Republicans are likely to see this as too open an invitation for the trial bar to sue over activity that may have no discernible effect on the litigant. Similarly, Democrats are not likely to accept a preemption provision that negates any state privacy law—unless a federal privacy regime includes several specific protections and provides a private right of action.
The steady drip of privacy headlines has helped. Slowly but surely, we continue to develop an understanding of which privacy-related activities in the mobile economy are offensive and which are beneficial. The major news outlets’ focus on egregious offenses has helped us identify the bad actors in a complex environment. For example, last year’s “Your Apps Know Where You Were Last Night, and They’re Not Keeping It Secret” pointed out some of the practices in which app developers shouldn’t be engaging. The article highlights situations where app makers were disclosing some of what they were doing with geolocation data but omitting highly relevant activities. Namely, “[a]n app may tell users that granting access to their location will help them get traffic information, but not mention that the data will be shared and sold.” That practice jumped out at me as probably illegal under current law. The Federal Trade Commission (FTC) Act bars “unfair or deceptive acts or practices in or affecting commerce,” and the New York Times was clearly describing a deceptive act or practice. But activity that’s already illegal—like misleading or false privacy disclosures—is driving much of the public discourse, in part because it’s happening in new and opaque ways and the FTC needs help policing it.
App developers who rely on advertising are probably most at risk of getting disclosure wrong—not because they want to deceive consumers, but because the ad ecosystem is especially complex and not always well understood. Although a relatively small percentage of our member companies rely on advertising for their businesses, we provide resources to ensure our members understand the ad ecosystem and maximize its potential. That’s why we were happy the Network Advertising Initiative (NAI) released its opt-in guidance for NAI members, which is good advice for app developers too. The guidance is important because it addresses the potential slide toward perceived bad—or illegal—behavior when developers face a tough judgment call about how to disclose complicated data processing activities. While a comprehensive federal privacy regime is necessary, industry-led efforts like this to sort out the details must accompany any legal updates.
Along with industry-led best practices and potential updates to federal law, competition among software platforms is also producing better privacy outcomes. Policymakers should be careful here to appreciate the relationship between our member companies and platforms. Platforms have drastically reduced overhead and trust-building costs for developers, and they have created privacy-protective environments that far exceed the capabilities of small companies. Unfortunately, almost simultaneously with urgent calls for platforms to improve their privacy practices, meaningful privacy steps taken by platforms are met with competition concerns. Ironically, better privacy controls are a sign that competition between platforms is robust and evidence of product differentiation for app companies that leverage platforms. App developers’ clients and customers benefit a great deal from platform-level privacy controls that are clear and that assume consumers want a privacy-protective posture, especially when the data processing environment is complicated.
All these developments took place in a year some might look back on with disappointment, wondering what could have been. But from my perspective, 2019 has been a year of incremental but significant progress. Even if only in limited ways, people outside the Beltway are talking about privacy, and that might just give Congress the political will it needs to achieve the seemingly insurmountable.