As regulatory enforcement ramps up and states enact their own laws, it’s time for organizations to re-examine their approach to privacy.
In brief
Data privacy is at the center of emerging legislation and evolving regulatory enforcement in the US and beyond.
Consumer demand and advanced technologies are intensifying the need for health care organizations to have robust data privacy protections.
Companies should compare their current privacy practices with applicable laws and regulations on a periodic basis and then resolve any deficiencies they identify.
As the data privacy landscape continues its rapid evolution, health care organizations are under more pressure than ever to adapt to new and emerging threats, adopt new technologies and comply with new regulations. At the federal level, passage of broad data privacy legislation appears difficult in the current Congress due to the narrow margins in the House, but executive action could reframe the federal government’s approach to artificial intelligence (AI) and data privacy. This is likely to intensify activity at the state level, resulting in a growing patchwork of data privacy laws that organizations must comply with. Organizations that fail to establish robust data privacy protections could make themselves — and their customers — vulnerable to significant risks. Fueling this pressure is a convergence of four key factors:
Four key factors behind the pressure to establish robust data privacy protections
Stronger consumer demand for data privacy transparency and protections, driven in part by high-profile privacy and security incidents
A growing patchwork of data privacy laws in the US, as states move forward with legislation in the absence of a comprehensive federal bill; as of the time of this publication, 19 states have passed data privacy laws, and three have privacy legislation currently moving through the legislative process¹
A redefined federal scope that could narrow the government’s focus on companies’ approaches to data privacy to those covered by the Health Insurance Portability and Accountability Act (HIPAA)
The rapid evolution and adoption of technologies like AI
Within this complex landscape, strong data safeguards have become a non-negotiable business imperative. By having the proper measures in place, organizations that handle sensitive consumer information can protect their business — and more.
Strong consumer demand
Consumer demand has heightened the need for organizations to adopt a strong data privacy approach. Public awareness of privacy issues is growing, particularly in light of highly publicized privacy and security incidents at major brand names in health care and beyond, and consumers want more control over their data.
94% of consumers in 2022 deemed it important to have control over the information they share with businesses and how the data is used.
Source: Qonsent
77% of consumers in 2022 said a company’s transparency on data practices impacts their purchase decisions.
Source: Qonsent
86% of Americans in 2023 were more concerned about the privacy and security of their data than the state of the US economy.
Source: Forbes
67% of Americans in 2023 didn’t know how their data was being used and who has access to it.
Source: Forbes
Growing patchwork of data privacy laws
The EU’s General Data Protection Regulation (GDPR), which took effect in 2018, set a new global standard for data privacy, inspiring similar legislation around the world, including India’s Digital Personal Data Protection Act, enacted in August 2023. According to the UN Conference on Trade and Development, over 70% of countries now have laws in place to protect personal data and privacy.²
In the US, comprehensive data privacy legislation at the federal level is a glaring gap in the privacy landscape, but efforts are underway to establish a national standard. In April 2024, lawmakers unveiled the American Privacy Rights Act, which would establish an enforceable national data privacy standard that would preempt state privacy laws. The bill, which would treat health and medical information as “sensitive covered data,” would require companies to take specific measures to ensure data privacy and patient consent. However, the bill failed to advance out of committee, and prospects for passage in the new Republican-controlled Congress are dim as key bill sponsors retire and Republicans in Congress focus on other priorities like tax reform, rolling back certain Biden administration policies, and controlling federal spending. Instead, we could see Congress continue to focus on narrower data privacy standards regarding children or youth.
In the absence of a national data privacy standard in the US, a growing number of states have been advancing their own legislation.
In 2018, California signed into law the California Consumer Privacy Act. The state established a privacy enforcement agency, the California Privacy Protection Agency, in 2020. In December 2024, California signed into law the Physicians Make Decisions Act, which ensures that a licensed health care provider oversees any decisions regarding medical treatments, and that such decisions are not solely made by AI. The law takes effect January 1, 2025.
In June 2021, Colorado signed into law the Colorado Privacy Act, which protects identifiable information and establishes five key rights for state residents around access, corrections, removal, portability and opting out. In December 2024, the state amended the law, introducing new obligations around the collection and processing of biometrics and minors' personal data.
In 2023, the Connecticut Data Privacy Act went into effect, including amendment SB3, which introduced consumer health data requirements for businesses.
In September 2023, Delaware enacted the Delaware Data Privacy Act, which strengthened data privacy rights for residents, including adding an opt-out process for personal data used in targeted advertising. The law takes effect January 1, 2025.
In September 2023, Florida signed into law the Florida Digital Bill of Rights, which adds new consumer consent requirements for certain sensitive data collection. The law excludes health records, health-related data, and certain government and private entities, including HIPAA-covered entities. Most provisions took effect July 1, 2024.
In May 2023, Indiana signed into law the Indiana Consumer Data Protection Act, which adds new transparency and disclosure requirements for those processing personal data. The law excludes health records and certain government and private entities, including HIPAA-covered entities. The law takes effect January 1, 2026.
In March 2023, Iowa signed into law the Iowa Consumer Data Protection Act, which adds new notice and opt-out requirements for businesses collecting or processing sensitive consumer data. The law takes effect January 1, 2025.
In April 2024, the Kentucky Consumer Data Protection Act was signed into law. The law sets new data privacy requirements for certain entities that operate in the state or target state residents and manage personal data of at least 100,000 consumers per year. The threshold drops to 25,000 consumers per year for entities that derive more than 50% of revenue from selling personal data. The law takes effect January 1, 2026.
In May 2024, Maryland enacted the Maryland Online Data Privacy Act, which requires companies to minimize and protect the personal data they control and includes additional protections for children. The law applies to companies that manage personal data of at least 35,000 residents per year. The threshold drops to 10,000 consumers per year for entities that derive more than 20% of revenue from selling personal data. The law takes effect October 1, 2025.
In May 2024, Minnesota enacted the Minnesota Consumer Data Privacy Act, which establishes new consumer data privacy protections, including a provision allowing consumers to question automated profiling decisions. The law applies to large and medium companies that manage personal data of at least 100,000 residents per year. The threshold drops to 25,000 consumers per year for entities that derive more than 25% of revenue from selling personal data. The law takes effect July 31, 2025.
In May 2023, Montana enacted the Montana Consumer Data Privacy Act, which limits the collection of personal data to information considered “adequate, relevant, and reasonably necessary.” The law provides residents with the right to opt out of or decline the sale of their personal data. The law took effect October 1, 2024.
In April 2024, New Hampshire enacted the New Hampshire Privacy Act, which gives consumers the right to know what data a company collects and to opt out of certain uses, including targeted advertising. The law applies to companies that manage personal data of at least 35,000 residents per year. The threshold drops to 10,000 consumers per year for entities that derive more than 25% of revenue from selling personal data. The law takes effect January 1, 2025.
In April 2024, Nebraska enacted the Nebraska Data Privacy Act, which gives residents the right to opt out of their data being sold or used for targeted advertising or profiling, as well as the right to request corrections or data removal. The law includes several exemptions, including small businesses and federally regulated financial institutions. The law takes effect January 1, 2025.
In January 2024, New Jersey enacted the New Jersey Data Privacy Act, which establishes privacy protections over the ways companies collect and use consumers’ personal information. The law applies to companies that manage personal data of at least 100,000 consumers per year. The threshold drops to 25,000 consumers per year for entities that sell personal data. The law takes effect January 15, 2025.
In July 2023, Oregon enacted the Oregon Consumer Privacy Act, which includes protections for biometric data, sensitive and personal data and children’s data. The law took effect July 1, 2024.
In June 2024, Rhode Island passed the Rhode Island Data Transparency and Privacy Protection Act, which requires consumer consent before processing sensitive data and gives consumers in the state the right to confirm and correct the data companies collect. The law also allows consumers to receive a copy of their data and to opt out of certain data uses. The law takes effect January 1, 2026.
In May 2023, Tennessee enacted the Tennessee Information Protection Act, which enables consumers in the state to confirm a business collected their personal data, receive a copy of the data and request corrections. The law takes effect July 1, 2025.
In June 2023, Texas passed the Texas Data Privacy and Security Act, which establishes new regulations for how businesses can collect, store, process, and sell consumer data. The law includes exceptions for most small businesses and took effect July 1, 2024.
In March 2022, Utah passed the Utah Consumer Privacy Act, which gives consumers the ability to access their data, request deletion and request a copy of data. The law went into effect December 31, 2023.
In March 2021, Virginia enacted the Virginia Consumer Data Protection Act, which gives consumers the right to access their data and request businesses delete their personal information. The law also includes new data protection assessment requirements for businesses. The law took effect January 1, 2023.
While the scope of data privacy laws in the US, including thresholds for applicability, varies by state, there are some common threads. Almost all of the enacted laws require opt-in consent for sensitive data processing and mandate data protection assessments.³ Most of the laws also establish consumer rights of access and portability, as well as rights to correct and delete data and opt out of sharing data for advertising.⁴
Redefined federal scope
The Biden administration took several steps, including a flurry of rulemaking, to broaden and strengthen data privacy enforcement efforts, resulting in higher penalties and costlier remediation programs. For example, in April 2024, the Federal Trade Commission (FTC) finalized changes to the Health Breach Notification Rule to regulate the handling of sensitive data more broadly. Now, vendors of personal health records and related entities — even those not covered by HIPAA — must inform individuals, the FTC and sometimes the media of a breach of unsecured personally identifiable health data. The rule was also modified to apply to non-HIPAA-covered health apps and other technologies.
However, this broader interpretation of statute may be revised under the Trump administration. New FTC chairman Andrew Ferguson and Attorney General Pam Bondi will put their own stamp on two agencies that play key roles in the enforcement of federal data privacy law.
While Ferguson has stated his support for the FTC’s role in protecting the privacy and security of consumers’ identifiable health information, he has been critical of the actions the FTC took under the Biden administration, including the Health Breach Notification Rule’s provision expanding oversight to non-HIPAA-covered entities and the Commission’s use of Policy Statements.⁵ Ferguson’s leadership signals a shift to more traditional FTC operations and interpretations of statute. In his confirmation hearing in 2023, Ferguson stated that the FTC has an important role in protecting consumer health data, but he has made it clear that action is needed from Congress to address complicated topics like privacy, data brokers and AI, and to provide more guidance on the FTC’s oversight role.⁶ However, that is not a signal that the FTC would ease enforcement in areas under its authority, and HIPAA-covered companies should expect continued scrutiny of their data privacy practices.
April 2024: FTC fined a telehealth company $7 million for the unauthorized disclosure of over 3 million consumers’ sensitive data, including protected health information (PHI), to third parties for advertising purposes.
February 2024: OCR reached a $4.75 million settlement with a health system for data security failures that led to the theft and sale of patients’ PHI by an employee.
May 2023: FTC fined the developer of a fertility app $100,000 for sharing sensitive health data with third parties without notifying users.
March 2023: FTC reached a $7.8 million settlement with an online counseling service for improper disclosure of consumers’ PHI and identifying data to third parties for advertising purposes.
February 2023: FTC fined a telehealth and prescription drug discount provider $1.5 million for the unauthorized sharing of millions of users’ PHI with third parties.
January 2022: A health system settled an $18.4 million class action lawsuit for using analytics tools on health care services websites without obtaining user consent.
Source: Federal Trade Commission, Department of Health and Human Services, The HIPAA Journal.
And as data privacy regulations try to keep pace with new technology, the cost of compliance and penalties will likely rise, making it even more important for companies to evaluate and strengthen their data protection measures now.
Rapid evolution and adoption of AI
Adding to the complexity of the issue are the rapid evolution and adoption of technology, and a potential shift in how the federal government views its role in overseeing AI.
President-elect Trump has vowed to rescind and replace President Biden’s executive order, which instructed federal agencies to create testing standards to evaluate privacy techniques used in AI and guardrails to protect personal data and prevent AI from being used in discriminatory ways. The second Trump administration is likely to revisit some of his initial executive actions on AI, which focused less on regulation and oversight and more on establishing US leadership in AI development. Therefore, it is possible organizations could see less clear direction from the federal government on ways to mitigate potential AI risks. In this environment, it will be incumbent on organizations to establish their own safeguards and internal policies for handling AI and related data privacy issues that continue to emerge.
Meanwhile, in Congress there is growing discontent regarding the use of AI and algorithmic software tools developed to guide prior authorization decisions in health plans. Bipartisan members of the House and Senate have sent letters to the Centers for Medicare and Medicaid Services (CMS) and commercial health plans encouraging increased oversight of the new technology. Growing public discontent with health plans’ use of prior authorization could drive Congress toward passing reforms to restrict or govern the use of such tools. For example, California recently passed a law to ensure that a licensed health care provider oversees any decisions regarding medical treatments, and that such decisions are not solely made by AI.
High stakes
For health care organizations that fail to establish strong data privacy protections, the consequences can be major. An inadequate approach makes regulatory noncompliance inevitable, and companies can be subject to significant penalties.
Deficiencies in protective measures also expose businesses to breaches, making it easier for sensitive data to be shared inappropriately through various means and business practices. A ransomware attack detected at one of the largest US health insurance companies in February 2024 led to significant revenue declines for many providers, even months later.⁷
But although cyber, financial and regulatory risks are a compelling enough reason to set up the appropriate safeguards, those aren’t the only threats that companies may face.
Also at stake is the valuable patient trust that health care organizations have worked to build over the years — trust that is becoming increasingly important.
According to a 2023 study, 71% of people find it more important to trust brands than they did in the past, and 73% place higher value on brands that strengthen their sense of safety and security.
Regulatory compliance and risk mitigation aren’t optional, especially in highly regulated sectors like health care. And companies across sectors can’t afford to jeopardize their relationships with customers. That’s why organizations must adjust their approach as the privacy landscape continues to evolve. In the health care sector, missteps can impact a company’s ability to provide its customers quality service or, in instances of lax data privacy protections, care delivery.
Taking action
Organizations must do several things to keep pace with all these developments.
Assess current privacy efforts against relevant regulations and laws based on the organization’s footprint.
Organizations will need to take this initial action to identify gaps in their privacy policies and develop a plan to address them. Depending on the organization’s footprint, this assessment should encompass local, state, federal and global policies that could have an impact.
To start, determine which regulations apply based on the jurisdictions of operation and the types of personal data that are processed, in addition to where data subjects reside. Once the regulatory impact assessment is complete, organizations should prioritize regulations and remediation based on impact and the potential risks for noncompliance. Some organizations have been evaluating the use of generative AI in rationalizing the requirements and creating granular, tactical implementation guidance for the business to execute privacy controls.
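As a minimal sketch, the applicability screen described above can be expressed as a simple threshold check. The states, thresholds and field names below are illustrative, drawn from the state-law summaries in this article; a real assessment would cover every jurisdiction in the organization’s footprint and the many statutory exemptions.

```python
from dataclasses import dataclass

@dataclass
class StateThreshold:
    """Applicability thresholds for one state privacy law (illustrative values)."""
    state: str
    base_threshold: int           # applies when managing personal data of >= N consumers/year
    reduced_threshold: int        # lower threshold for entities that sell personal data
    revenue_share_trigger: float  # minimum share of revenue from data sales to trigger it

# Thresholds drawn from the state-law summaries above (illustrative, not exhaustive).
# New Jersey's lower threshold applies to any seller of personal data, so its
# revenue-share trigger is zero.
THRESHOLDS = {
    "NJ": StateThreshold("NJ", 100_000, 25_000, 0.0),
    "NH": StateThreshold("NH", 35_000, 10_000, 0.25),
    "MD": StateThreshold("MD", 35_000, 10_000, 0.20),
}

def law_applies(t: StateThreshold, consumers: int, sells_data: bool, revenue_share: float) -> bool:
    """Return True when an organization's footprint meets the state's threshold."""
    if consumers >= t.base_threshold:
        return True
    if sells_data and revenue_share >= t.revenue_share_trigger:
        return consumers >= t.reduced_threshold
    return False
```

A screen like this only narrows the list of regulations to evaluate; the prioritization and remediation planning that follow remain a judgment exercise.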
While companies are generally making progress in implementing capabilities for third-party risk management, training and awareness, and internal and external assurance, they may be struggling to address other significant data and privacy protection deficiencies. The complex privacy capabilities where companies most often face the toughest obstacles include:
Data rights management: upskilling existing teams to support individual data rights
Privacy and security by design: incorporating controls at all points to consider the user experience, requirements and opportunities to create more privacy-protecting products and services
Notice and consent: procedures to routinely confirm the permissibility of data use based on prior notice and consent; unified consent management correlated to a single identity; maintenance of consent for different products and services across the organization
Cross-border data restrictions: detail on cross-border data flows is needed, as is a process to approve personal data export based on jurisdiction-specific requirements
Data mapping and inventory: data mapping is often insufficient; it must involve an end-to-end process that includes structured and unstructured data
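To illustrate the notice-and-consent challenge of “unified consent management correlated to a single identity,” here is a minimal sketch of a consent ledger keyed to one consumer identity, with a default-deny permissibility check. All names and structures are hypothetical assumptions, not a production design.

```python
from datetime import datetime, timezone

class ConsentLedger:
    """Minimal consent ledger correlating consent records to a single identity."""

    def __init__(self):
        self._records = {}  # (consumer_id, purpose) -> latest consent record

    def record_consent(self, consumer_id: str, purpose: str, granted: bool):
        """Store the most recent consent decision for a given data use."""
        self._records[(consumer_id, purpose)] = {
            "granted": granted,
            "timestamp": datetime.now(timezone.utc),
        }

    def is_permissible(self, consumer_id: str, purpose: str) -> bool:
        """Confirm permissibility of a data use based on prior consent; default deny."""
        rec = self._records.get((consumer_id, purpose))
        return bool(rec and rec["granted"])

ledger = ConsentLedger()
ledger.record_consent("patient-123", "targeted_advertising", granted=False)
ledger.record_consent("patient-123", "care_coordination", granted=True)
```

The default-deny check is the design point: a data use with no recorded consent is treated as impermissible, which is what routine permissibility confirmation requires.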
Ensure that routine practices align with the organization’s privacy program and goals.
Proper protection is also about making sure that routine practices align with the organization’s privacy program and goals. The following actions can help health care organizations as they work to strengthen their initiatives:
Establish an information classification system and corresponding safeguards for PHI and personally identifiable information (PII) data. Apply security measures depending on how the information is classified (e.g., confidential, internal), and have the designated asset owner assign data classifications.
Conduct periodic checks so that data is retained only for the minimum necessary period required to support business processes. The organization’s data retention policy should outline the duration of the period.
Collect physical documents containing PHI or PII data as soon as they are printed. Confirm that they are being used by authorized personnel only. These documents must be disposed of properly in accordance with the requirements defined in the organization’s information classification policy. If the documents are still in use by an employee who needs to step away, the employee must store them in a locked place.
Encrypt PHI and PII data when it’s either stored or in transit via a secure communication channel (e.g., HTTPS, VPN).
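The retention check above lends itself to a simple periodic job. A minimal sketch, assuming illustrative classification labels and retention periods; the real durations would come from the organization’s data retention policy and information classification system.

```python
from datetime import date, timedelta

# Illustrative retention periods by information classification; actual
# durations must come from the organization's data retention policy.
RETENTION_PERIODS = {
    "confidential": timedelta(days=365 * 6),
    "internal": timedelta(days=365 * 2),
}

def overdue_for_disposal(classification: str, created: date, today: date) -> bool:
    """Flag records held beyond the minimum necessary period for their class."""
    period = RETENTION_PERIODS.get(classification)
    if period is None:
        # Unclassified data has no defined retention period and should be escalated
        raise ValueError(f"unknown classification: {classification!r}")
    return today - created > period
```

Running a check like this on a schedule turns the “periodic checks” recommendation into an auditable control rather than an ad hoc cleanup.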
Establish and maintain risk governance protocols.
Every company should be appropriately organized to address data privacy. Everyone in an organization has a role in safeguarding data, so risk governance must extend to all levels, starting with executive management and the board. Organizations should consider adopting the three lines of defense model. By making specific groups (e.g., risk-taking business units, risk and compliance functions, the internal audit function) responsible for overseeing distinct aspects of the company’s approach to privacy across the three lines of defense, the organization develops a structure that supports thorough data privacy protection.
Health care companies that are designing and implementing privacy programs should keep the following roles and responsibilities in mind.
A graphic shows the roles and responsibilities that different levels of the organization should consider when designing and implementing a privacy program.
Executive management and board
Approves the overarching privacy framework, methodologies, and roles and responsibilities
Leverages privacy information in the decision-making process
Evaluates business unit (BU) activities on a risk-adjusted basis
1st line — risk ownership (risk-taking business units)
Owner of the privacy processes
Identifies and mitigates risk
Designs and implements controls
Designs technology capabilities to support data protection
Owns vendor management and contract reviews
Manages consumer requests to delete information
Manages notices informing consumers that their data is being sold and of their right to opt out
Adheres to disclosure timelines set by the state (e.g., 45 days to provide information following receipt of a verifiable request)
Defines employee segmentation abilities
Sets employee rights and consent
2nd line — oversight and monitoring (risk management)
Designs and deploys the overall privacy management framework across the organization (e.g., establish governance and accountability)
Monitors BU adherence to privacy and security frameworks
Compiles exposures across BUs and escalates risk and control issues to senior management
Assesses program design across the business units and seeks opportunities to align with requirements of the regulation
Monitors and tests compliance with regulations
Develops and monitors policies and procedures
Monitors risk-assessment-based compliance testing
Validates that there is no discrimination against the consumer because they exercised their rights
3rd line — independent assurance and validation (internal audit function)
Provides independent testing and verification of efficacy of business line compliance
Validates the overall privacy framework
Provides assurance that the privacy processes are functioning as designed and identifies improvement opportunities
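As a small worked example of the first line’s disclosure-timeline responsibility, the response window can be computed directly. The 45-day default mirrors the example cited above; actual windows and extension rules vary by statute, so this is a tracking aid, not legal guidance.

```python
from datetime import date, timedelta

def disclosure_deadline(received: date, window_days: int = 45) -> date:
    """Compute the response deadline for a verifiable consumer request.

    The 45-day default mirrors the state example cited above; real statutes
    may allow extensions, so treat this as a tracking aid.
    """
    return received + timedelta(days=window_days)

# A request received January 1, 2025 falls due February 15, 2025
deadline = disclosure_deadline(date(2025, 1, 1))
```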
Align privacy framework to leading standards.
Organizations seeking to effectively strengthen their privacy protections would benefit from aligning their privacy framework to leading standards on everything from accountability to the monitoring and enforcement of compliance with privacy policies and procedures. By acting in accordance with these proven approaches, they can drive change without having to reinvent the wheel. Incorporating leading standards helps establish an environment where privacy policies and procedures are well defined, documented and communicated; data is properly used and protected; and accountability is assigned.
The following framework details many of the leading privacy practices.
Aligning privacy framework to leading standards — 10 proprietary controls:
Management: The entity defines, documents, communicates and assigns accountability for its privacy policies and procedures.
Notice: The entity provides notice about its privacy policies and procedures and identifies the purposes for which personal information is collected, used, retained and disclosed.
Choice and consent: The entity describes the choices available to the individual and obtains implicit or explicit consent with respect to the collection, use and disclosure of personal information.
Collection: The entity collects personal information only for the purposes identified in the notice.
Use, retention and disposal: The entity limits the use of personal information to the purposes identified in the notice and for which the individual has provided implicit or explicit consent. The entity retains personal information for only as long as necessary to fulfill the stated purposes or as required by law or regulation and thereafter appropriately disposes of such information.
Access: The entity provides individuals with access to their personal information for review and update.
Disclosure to third parties: The entity discloses personal information to third parties only for the purposes identified in the notice and with the implicit or explicit consent of the individual.
Security for privacy: The entity protects personal information against unauthorized access (both physical and logical).
Quality: The entity maintains accurate, complete and relevant personal information for the purposes identified in the notice.
Monitoring and enforcement: The entity monitors compliance with its privacy policies and procedures and has procedures to address privacy-related complaints and disputes.
The concept of privacy by design, a GDPR requirement, is one of the leading standards that can help health care organizations enhance their data privacy posture. According to the European Commission, it involves taking “technical and organisational measures, at the earliest stages of the design of the processing operations, in such a way that safeguards privacy and data protection principles right from the start.”⁹ By leveraging the following privacy-by-design framework, health care companies can set their privacy programs up for success.
A graphic depicts the Privacy by Design (PbD) Capabilities Framework.
Data subject experience
Notices — audience-appropriate privacy notices are provided in context
Collection — data minimization practices are applied and collection is relevant to user
Consent — transparent and appropriate to the proposed data and uses
Privacy defaults — privacy-respecting defaults are defined and implemented
Consumer rights — provide tools for data subjects to exercise their rights (e.g., to access, portability, deletion)
Consumer controls — provide an easy-to-access and user-friendly interface for data subjects to have meaningful, ongoing control over their personal data
Communications — tailored to match the capabilities and knowledge level of the target audience
Governance
Executive accountability and support is provided
Staffing and RACIs — roles and responsibilities are defined and individuals assigned
Policies, standards and procedures — are established and implemented
Training — role-appropriate training is provided and updated regularly
Regulatory engagement — mechanisms defined and maintained
Data use and management
Data minimization — requirements established and applied across the data lifecycle
Data classification — comprehensive classification of types of personal data processed
Data use cases — inventory of approved data use cases
Data and cookie inventory — established and maintained
De-identification
Data quality
Disposition framework — defined requirements regarding the treatment of data classes and elements across the data lifecycle
Security
Implementation of strong security measures into systems that process personal data, for every step of the data lifecycle, including:
Identity access management
Monitoring
Encryption
Incident response
Change management
Risk framework — supports the analysis of risk
Risk analysis — the processes and tools used to collect information and conduct the risk analysis (e.g., security review, third-party risk management (TPRM) assessment, privacy impact assessment (PIA), data protection impact assessment (DPIA))
Remediation — enabling monitoring of the implementation of recommendations
Development — guidelines, processes and tools to support dev and testing processes
Monitoring and reporting
External monitoring — monitoring of regulatory updates and common/leading industry practices
Internal monitoring and reporting — report on compliance and status of privacy program activities
Conclusion
Although the privacy regulatory landscape is still evolving, health care organizations that start assessing and strengthening their data privacy measures now will be a step ahead once the applicable legislation is enforced, whether it’s comprehensive national or state legislation. This proactive approach is necessary because compliance is an inevitable requirement. And while compliance is essential, it isn’t the only reason a business needs strong data privacy protections. Organizations must do everything in their power to avoid data breaches and other incidents that can jeopardize some of their most critical assets — their reputation and their relationships with patients.
This article was co-authored by Bridgette Harris, Rita MacDerment, and Sam Lanzino.
Evolving legislation, heightened consumer expectations and rapid technological advances, including AI, are intensifying the pressure on health care organizations to strengthen their data privacy. A patchwork of state laws and stricter federal oversight are necessitating comprehensive privacy frameworks. To mitigate risks, maintain patient trust and avoid substantial penalties, organizations in the sector must adopt proactive compliance strategies.