Members of ACT | The App Association have advocated for a national privacy framework for years, and as the national conversation around health data evolves, that framework becomes more important than ever. The App Association represents an ecosystem valued at approximately $1.8 trillion that supports 6 million American jobs. All of these businesses and workers would benefit from a comprehensive privacy framework. As the House Energy and Commerce Committee continues examining both the privacy risks and the vast benefits of activities that collect and process consumer data not currently covered by a specific privacy framework, the App Association urges consideration of a few key principles.

Policymakers are appropriately concerned about the privacy and security implications of data collection and processing around the edges of the Health Insurance Portability and Accountability Act (HIPAA), the Gramm-Leach-Bliley Act (GLBA), and the Family Educational Rights and Privacy Act (FERPA). Because the United States lacks a comprehensive privacy law, the gaps between those statutes and what consumers think of as “health,” “finance,” and “education” data are real. At the same time, digital health, financial technologies (FinTech), and digital education tools provide expanded access to higher-quality, more cost-effective services across these critical industries. The activity taking place around these privacy silos is some of the most dynamic and beneficial economic activity the world has to offer, but the privacy and security sensitivities are accordingly higher. Those sensitivities are the best reason for Congress to enhance federal privacy protections.

Here are three key recommendations for Congress to consider as members continue their legislative work on comprehensive privacy legislation:

  1. Simply “expanding” HIPAA is a non-starter. HIPAA is an interoperability regime designed for an incredibly narrow set of “covered entities” providing healthcare services to patients. Expanding that list to all entities processing data with any connection to health, like grocery stores, would turn the Department of Health and Human Services (HHS) and its sub-agency, the Office for Civil Rights (OCR), into a second Federal Trade Commission (FTC), but one with a staff of 72 already overseeing 6,000 annual complaints. It would also convert much of the economy into an interoperable system required to maintain data for audit purposes.

The proliferation of health data outside the HIPAA umbrella, especially through consumer apps and wearable health technology, demonstrates the need for a broader privacy framework that can work in concert with existing HIPAA regulations. As recent news stories about health companies sharing sensitive personal information or suffering data breaches have shown, health privacy abhors a vacuum. Unless Congress specifically arms the FTC with the authority to enjoin privacy harms, the evidence suggests adverse headlines will continue, even though the FTC is making use of its current tools. The FTC’s recent consent orders show that it has prioritized punishing privacy and security abuses by digital health companies that may run afoul of the FTC Act’s prohibitions on unfair or deceptive acts or practices.

In addition, HIPAA is a bad fit for direct-to-consumer digital health tools: its main purpose is to ensure interoperability between health providers so that a patient can port their health records across providers. Consumer-facing products and services with health-related aspects are fundamentally different from patient medical records and appropriately require a risk-based approach to privacy and security protections. For this reason, the App Association strongly supports updated language in the American Data Privacy and Protection Act (ADPPA) that would clarify that protected health information (PHI) is exempt from ADPPA’s requirements.

  2. Financial services go beyond GLBA and need a risk-based framework to better empower consumers. Like HIPAA, GLBA applies to a narrow, already-defined group of entities. But unlike HIPAA, GLBA currently lacks consumer data access requirements. The result is that the GLBA silo sometimes traps financial information, making it more difficult for consumers to understand and control their own data. A risk-based framework would make clear to industry what is permissible and spur innovation.

The app economy activity in and around the scope of GLBA is robust. Our FinTech member companies are solving emerging and long-intractable problems for consumers. For example, Goalsetter provides a financial education platform for children that allows kids to receive allowances or monetary gifts from friends, parents, and relatives, and to spend money through the Goalsetter debit card. Another kids’ digital wallet company, REGO, has gone so far as to patent the Children’s Online Privacy Protection Act (COPPA)-compliant opt-in protections in its Mazoola mobile wallet for kids.[1] Both of these FinTech apps put parents in charge and empower kids to learn financial literacy. With studies indicating that just over half of Americans are considered financially literate and only 24 percent of millennials understand basic financial concepts,[2] App Association members and companies like them are leveraging the power of smart devices and platforms to address this issue in privacy-protective ways.

Unfortunately, “the current data access regime involves a mixture of informal credentials-based access agreements and formalized, token-based access agreements. This system is complicated to navigate for both consumers and third parties and often allows traditional financial institutions to impose their will regardless of consumer welfare.”[3] This unnecessary friction has led some FinTech companies to play fast and loose with consumer expectations, opting to “scrape” data from consumers’ online banking screens in order to populate their apps.[4] Even though scraping was typically done to effectuate what developers assumed was their customers’ intent, it never involved actual notice to, or consent from, the consumer, because it took place outside the managed lines of communication and contract.

Financial institutions must enable safe, secure access by customers to their own financial data via open APIs, with appropriate data security and privacy guardrails. Given the overwhelming policy interest in enabling consumers to access their own financial information and transfer it outside the GLBA umbrella, an equally important task is ensuring that consumers continue to benefit from optimal privacy and security protections outside the scope of GLBA. The answer must be a federal, risk-based privacy framework.
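To make the distinction concrete, below is a minimal sketch contrasting the two access patterns just described: credential-based screen scraping versus scoped, token-based API access. It is illustrative only; the endpoints, field names, and scopes are hypothetical and do not represent any real bank’s API or any member company’s implementation.

```typescript
// Minimal sketch (TypeScript, Node 18+ global fetch). All URLs and field
// names below are hypothetical illustrations, not a real bank's API.

// Pattern 1: screen scraping. The aggregator holds the consumer's actual
// banking credentials, replays them against the bank's login page, and
// parses HTML meant for human eyes. There is no scoped, revocable grant;
// revoking access means the consumer must change their password.
async function scrapeBalance(username: string, password: string): Promise<string> {
  const login = await fetch("https://bank.example.com/login", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({ username, password }), // raw credentials sent along
  });
  const html = await login.text();
  // Fragile: this breaks whenever the bank redesigns its pages.
  const match = html.match(/id="balance">([^<]+)</);
  return match?.[1] ?? "unknown";
}

// Pattern 2: token-based open API (OAuth 2.0-style). The consumer grants the
// app a narrow, revocable token scoped to reading balances; credentials never
// leave the bank, and the bank can log and audit exactly what was shared.
async function fetchBalance(accessToken: string): Promise<number> {
  const res = await fetch("https://api.bank.example.com/v1/accounts/123/balance", {
    headers: { Authorization: `Bearer ${accessToken}` }, // scoped, revocable grant
  });
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  const data = (await res.json()) as { available: number };
  return data.available;
}
```

The difference matters for privacy: the token model never exposes the consumer’s credentials, gives the bank an auditable record of exactly which data left its systems, and lets the consumer revoke a single grant without changing a password. These are precisely the guardrails a risk-based framework can require.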

  3. FERPA overlaps with the FTC Act and its child privacy requirements, resulting in uncertainty for parents, commercial industries, and educational institutions alike. Instead of compounding the risks these overlaps present by imposing age verification requirements or increasing data collection through a “constructive knowledge” threshold, a federal privacy law should modernize the verifiable parental consent (VPC) requirements currently in place.

The statutory provisions Congress enacted through FERPA do not explicitly contemplate third-party companies providing digital education services using education records. However, schools routinely release education records to third-party education services companies, without triggering the requirement to obtain parental consent for disclosure, via the statutory exception allowing schools to provide such records to “school officials.” U.S. Department of Education (ED) regulations spell out the relationship more concretely. To qualify for the school officials exception, a school must determine that a third party “(1) performs an institutional service or function for which the agency or institution would otherwise use employees; (2) is under the direct control of the agency or institution with respect to the use and maintenance of education records; and (3) is subject to the requirements of Sec. 99.33(a) governing the use and redisclosure of personally identifiable information from education records.”[5]

The app economy is thriving in education technology. Thinkamingo is an educational app company focused on getting kids excited about writing. Its app Story Dice helps give kids ideas for stories, while its apps Lists for Writers and Story Spark help kids lay out their stories, build out their characters and plot points, and gain the tools they need to improve their overall writing and story structure. Through contracts with school districts, Thinkamingo provides these tools to students in the school context, which places its activities within the scope of FERPA. But to the extent that the apps are available to kids and parents directly, COPPA and the FTC Act are the federal laws covering their privacy practices. Another member company, TORSH, provides a platform for teachers’ professional development, enabling streamlined review, analysis, and management of classroom video clips.[6] The ability to rely on digital tools like TORSH’s is critical for schools experiencing teacher shortages and striving to maintain their workforce.

The FTC’s COPPA guidance for education technology companies emphasizes that students should not have to trade access to digital education services for their privacy.[7] This messaging addresses privacy concerns that developed rapidly over the past three years, especially among parents, as the COVID-19 pandemic pushed schools to a virtual model that leaned heavily on digital tools. Parents worried that their children’s mandatory use of those services would expose their children to undue privacy and data security risks, in an environment where no in-person alternative was available. Against this backdrop, the FTC sought to remind consumers and parents that the FTC Act and COPPA still apply to education technology companies. Most notably, the FTC reminded education technology companies that, even when they also comply with FERPA, COPPA still prohibits them from 1) conditioning access to a service on a child disclosing more information than is reasonably necessary for the child to participate in an activity; 2) using children’s information for marketing, advertising, or other commercial purposes unrelated to the provision of the school-requested service; 3) retaining personally identifiable information (PII) about a child longer than reasonably necessary to fulfill the purpose for which it was collected; and 4) failing to maintain procedures to protect the confidentiality, security, and integrity of children’s PII.

As we look toward an increasingly digital future, privacy will only become more important. Any updates to the FTC Act or to COPPA protections need to include modernization of VPC requirements. VPC currently shifts the onus for privacy protection onto parents rather than companies, in a world where parents have little choice but to let their children make beneficial use of digital services. Requiring multiple redundant copies of parents’ PII to exist in every corner of the internet their children may need to visit becomes a less workable concept with each passing day. Additionally, any general consumer privacy legislation addressing kids’ privacy should avoid imposing age verification requirements or mandates that demand similar levels of data collection to “verify” or “assure” a child’s identity. Requiring detailed PII profiles on children to exist across the ecosystem, held by every company providing services a child may access, introduces more serious privacy and security risks than are necessary. In fact, such requirements may conflict with other privacy provisions of a federal bill, especially those that apply to more sensitive classes of information like biometric identifiers.[8]

Each of these sector-specific silos presents unique challenges as we work toward a national privacy framework. The gaps between them are often wider than they appear, and the work happening around the edges of these protections is some of the most robust in today’s app economy. Congress must continue its work on a national framework that ensures all data is covered appropriately, all Americans can access their data, and data security is prioritized.

[1] See mazoola: A kids mobile wallet powered by privacy, available at https://mazoola.co/.

[2] Kevin P. Chavous, “A Hand Up Or A Handout? Tackling America’s Financial Literacy Crisis,” Forbes (Feb. 3, 2022), available at https://www.forbes.com/sites/stopaward/2022/02/03/a-hand-up-or-a-handout-can-we-tackle-americas-financial-literacy-crisis/?sh=2258745fe251.

[3] Id.

[4] Benjamin Pimentel, “Banks and fintechs agree: It’s time for screen scraping to go. So what’s next?” protocol (Oct. 5, 2021), available at https://www.protocol.com/fintech/fdx-financial-data.

[5] 34 C.F.R. Sec. 99.31(a)(1)(i)(B).

[6] Torsh, Power packed features drive results, available at https://www.torsh.co/features/.

[7] FTC EdTech Policy Statement at 4, “Children should not have to needlessly hand over their data and forfeit their privacy in order to do their schoolwork or participate in remote learning, especially given the wide and increasing adoption of ed tech tools.”

[8] Eric Goldman, “Do Mandatory Age Verification Laws Conflict with Biometric Privacy Laws? – Kuklinski v. Binance,” Tech. and Marketing L. Blog, Apr. 8, 2023, available at https://blog.ericgoldman.org/archives/2023/04/do-mandatory-age-verification-laws-conflict-with-biometric-privacy-laws-kuklinski-v-binance.htm (“The invasiveness of [age verification] requirements could overwhelm and functionally moot most other efforts to protect consumer privacy.”).