
12 Days of Data Privacy

2023 was a monumental year for privacy as we saw many U.S. states roll out privacy regulations and regulators cracking down on companies for violations. With the seemingly ubiquitous adoption of artificial intelligence (AI) across companies, the privacy world also raced to create measures that would ensure the equitable and safe adoption of these new technologies. 

As we close out 2023, we will cover 12 emerging privacy trends to be aware of as we head into 2024. We will be publishing one new trend each day of the 12 business days leading up to the new year.

#1 New U.S. Privacy Regulations 

Several U.S. states, including California, Oregon, Utah, Colorado, Virginia, Texas, Montana, Tennessee, Delaware, and Connecticut, have enacted data privacy regulations.

On the docket for 2024, Wisconsin and New Hampshire plan to roll out their privacy regulations. These regulations will incorporate requirements that we see in existing state laws, such as publishing transparent privacy notices and allowing data subjects the right to know/access, right to correct, right to delete, and right to opt out.  

However, the Wisconsin Data Privacy Act (WDPA) will also require organizations to have data processing agreements (DPAs) with processors of personal data that establish the purpose, duration, and type of processing. The WDPA would be the first U.S. state privacy legislation to require DPAs.1

At the crux of compliance for these privacy regulations is the need to build a data inventory that keeps track of systems that store personal data and processing activities that involve personal data. This will allow organizations to more easily comply with data subject rights requests and to make accurate disclosures of personal data use and collection in their privacy policies. 
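As a rough illustration of what such an inventory can look like in practice, the sketch below models systems, data elements, and processing activities, and answers the two questions compliance teams ask most often: which systems hold a given data element (for rights requests) and why each activity collects data (for privacy-notice disclosures). The structure and names are illustrative assumptions only, not a prescribed format.

```python
from dataclasses import dataclass, field

@dataclass
class System:
    name: str
    owner: str
    data_elements: set[str]          # e.g., {"email", "purchase_history"}

@dataclass
class ProcessingActivity:
    name: str
    purpose: str
    systems: list[str]               # systems used by this activity
    data_elements: set[str]

@dataclass
class DataInventory:
    systems: dict[str, System] = field(default_factory=dict)
    activities: list[ProcessingActivity] = field(default_factory=list)

    def systems_holding(self, element: str) -> list[str]:
        """Locate every system storing a given personal data element,
        e.g., to scope a right-to-access or right-to-delete request."""
        return [s.name for s in self.systems.values() if element in s.data_elements]

    def disclosures(self) -> dict[str, str]:
        """Summarize collection purposes by activity for privacy-notice drafting."""
        return {a.name: a.purpose for a in self.activities}

# Illustrative usage
inv = DataInventory()
inv.systems["crm"] = System("crm", "Marketing", {"email", "name", "purchase_history"})
inv.activities.append(
    ProcessingActivity("email_marketing", "Send promotional emails", ["crm"], {"email", "name"})
)
print(inv.systems_holding("email"))  # ['crm']
```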

There are also a few U.S. health privacy regulations, such as Washington’s My Health My Data Act, the Nevada Consumer Health Data Law, and the Connecticut Data Privacy Act (CTDPA) Consumer Health Amendments, that will have broader implications in 2024.  

#2 European Union (EU) AI Act 

As an increasing number of organizations adopt AI technologies, regulators around the world are looking to balance its rapid proliferation with privacy protection.

No stranger to setting precedent, the EU passed the world’s first broad AI regulation, the EU AI Act, this past Friday. Much like the GDPR set the standard for data privacy laws in other countries, the rest of the world will look to the EU AI Act as a framework on which to base their own AI legislation, and similar regulations are likely to follow from other governments.

The EU AI Act has been under discussion since 2019 but was recently expanded to address how generative AI systems, like OpenAI’s ChatGPT, should be governed.

The Act has drawn controversy: tech companies claim it will curb innovation, while regulators are concerned about the widespread implications of AI for society. The law not only bans certain types of AI, like real-time biometric identification technologies, but also establishes transparency guidelines for AI companies to comply with. The Act aims to reduce potentially discriminatory harm to individuals and protect their right to privacy.

The bans will take effect six months after the Act enters into force, the transparency requirements after one year, and the full Act after two years.2

#3 Emerging U.S. Health Privacy Regulations 

2023 saw the introduction of various new regulations around consumer health privacy. The most notable of these include Washington’s My Health My Data Act, the Nevada Consumer Health Data Law, and the Connecticut Data Privacy Act (CTDPA) Consumer Health Amendments.  

While there were some aspects, like geofencing requirements, of Washington’s My Health My Data Act that went into effect in 2023, the rest of the Act will be effective for non-small businesses by the end of March 2024 and by the end of June 2024 for small businesses. The law applies not only to healthcare organizations but also to any organization that conducts business in Washington that processes consumer health data.  

The My Health My Data Act defines consumer health data as “personal information that is linked or reasonably linkable to a consumer and that identifies the consumer’s past, present, or future physical or mental health status.” Since this definition also includes any information that can be extrapolated from non-health data, many retailers would also be in the scope of the law. For example, certain inferences drawn from purchases, like pregnancy status, could qualify as consumer health data.  

Compliance measures include not selling consumer health data without explicit consent, limiting the use of geofences, creating a consumer health data privacy policy, and making it available to the public.3

The Nevada Consumer Health Data Law (effective end of March 2024) and the CTDPA Consumer Health Amendments (already in effect) have similar requirements, but while those two are enforced by their state attorneys general, Washington’s My Health My Data Act also includes a private right of action. This could have major litigation implications for organizations, as no other health data protection law in the U.S., including the Health Insurance Portability and Accountability Act (HIPAA), currently has a private right of action.

#4 FTC Health Breach Notification Rule 

The Federal Trade Commission (FTC) Health Breach Notification Rule is expected to have a notable impact on organizations as it requires organizations that are processing consumers’ identifying health information to notify consumers following a breach of unsecured information. In cases of breaches involving 500 or more people, this also requires media notice.  

This rule is already in effect and its implications are far-reaching, as it applies to organizations not specifically covered by HIPAA, including any third-party service providers that handle consumers’ identifying health information or that send information to, or receive information from, a product that stores consumers’ identifying health information.

Compliance with this regulation requires understanding data flow within an organization and implementing appropriate safeguards to ensure consumers’ health information is secured.4 

The FTC has already brought actions against organizations such as GoodRx and Easy Healthcare. In the case of Easy Healthcare, the FTC fined the company $100,000 for failing to implement appropriate safeguards while sharing health information with third parties via certain software development kits.5

This action shows how broad the scope of this rule is and signals to organizations that this will be an area of enforcement focus going into 2024.

#5 CPRA Enforcement in March 2024 

In June 2023, the Superior Court of California for the County of Sacramento (Court) ruled that the final regulations of the California Privacy Rights Act (CPRA) could not be enforced until March 29, 2024, delayed from the original enforcement date of July 1, 2023. The Court agreed that enforcement should not begin until one year after the regulations were finalized, and the final CPRA regulations only took effect in March 2023.6

Some key action items from the CPRA that will need to be operationalized include (1) performing an annual cybersecurity audit, (2) submitting privacy impact assessments for high-risk processing activities to the California Privacy Protection Agency (CPPA) annually, and (3) allowing consumers the option to opt out of automated decision-making technologies. Requirements already in place include (1) disclosures around the purpose for collecting information and the sharing/sale of personal information, (2) limitations on secondary use of data and sensitive information collection, (3) inclusion of opt-out links on websites and in privacy policies, (4) specifications on privacy policy disclosures, (5) “just-in-time” notices when information is being collected, and (6) notice to employees about the information collected on them.7

#6 California PIA Requirements 

The CPRA stipulates that organizations must conduct Privacy Impact Assessments (PIAs) for any processing activities that present a high risk to consumers’ privacy. All PIAs conducted must be submitted to the CPPA annually, and organizations should also be prepared to provide them to the CPPA upon request.

The CPPA requires that a PIA be conducted when there is targeted advertising, a sale of personal information, profiling, monitoring of personnel, monitoring of consumers in publicly accessible places, processing of personal information to train artificial intelligence (AI) or automated decision-making technologies, or any other activity that would pose a risk to consumers’ privacy. At a minimum, a PIA should contain (1) a description of the process, (2) the categories of information processed, (3) the context of collecting the data, (4) the consumers’ expectations around the purpose for processing their data, (5) the elements being processed, (6) the purpose for collecting the data, (7) the benefits of the process, (8) the negative impacts to consumers of processing their data, (9) the mitigation steps the organization is taking to address the negative impacts, and (10) the organization’s assessment of whether the negative impacts outweigh the benefits.8

Other states with active privacy regulations, such as Colorado, Connecticut, and Virginia, also have PIA requirements.9

The first step toward compliance with this requirement is building out a data inventory that tracks the processing activities within an organization and includes questions that identify whether a PIA is required. The organization should then conduct PIAs for those high-risk processing activities and submit the results in abridged form to the CPPA, as sketched below.
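As a minimal sketch of that screening step, the helper below flags activities whose characteristics match the PIA triggers listed above; the flag names and structure are hypothetical, not taken from the draft regulations.

```python
# Hypothetical PIA-trigger screening against the draft criteria described above.
PIA_TRIGGERS = {
    "targeted_advertising",
    "sale_of_personal_information",
    "profiling",
    "personnel_monitoring",
    "public_space_monitoring",
    "ai_or_admt_training",
}

def pia_required(activity_flags: set[str]) -> bool:
    """Return True if any characteristic of the processing activity
    matches one of the PIA triggers."""
    return bool(activity_flags & PIA_TRIGGERS)

# Illustrative usage against two inventory entries
print(pia_required({"profiling", "web_analytics"}))  # True
print(pia_required({"web_analytics"}))               # False
```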

#7 Canadian Privacy Regulations 

Canada has a federal privacy law, the Personal Information Protection and Electronic Documents Act (PIPEDA), in effect since 2001. Several Canadian provinces, including British Columbia and Quebec, have their own provincial data protection laws that take precedence over PIPEDA. 

A new bill, the Digital Charter Implementation Act (the Bill), has been proposed; it would update PIPEDA with more stringent privacy measures, closer to those seen in the General Data Protection Regulation (GDPR).

Of note, it introduces the Consumer Privacy Protection Act (CPPA), which would modify PIPEDA by requiring organizations to have privacy management programs, protect minors’ data, and be transparent about any automated decision-making in use. The CPPA also grants data subjects certain rights, such as the right to have their data deleted or transferred to another organization, and it establishes a private right of action, which PIPEDA currently lacks.

The Bill also proposes the creation of a tribunal to enforce the CPPA through the Personal Information and Data Protection Tribunal Act (PIDPTA).

After passing second reading in the House of Commons in April, the Bill is currently being reviewed by the Standing Committee on Industry and Technology.10

In Quebec, Law 25 will be enforced and includes requirements for a Data Protection Officer (DPO), Privacy Impact Assessments (PIAs) for high-risk activities, and disclosure of automated processing.11 For more information on Quebec’s Law 25, please see this article.

Law 25 also requires opt-in, as opposed to opt-out, consent for collecting personal information, which deviates from other North American regulations. This means that cookie banners for a Quebecois audience will need to be configured to reflect this opt-in consent preference.12 
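As a rough illustration of what that opt-in configuration might look like, the sketch below defaults every non-essential cookie category to off until the visitor actively opts in; the category names and structure are hypothetical and not tied to any particular consent-management platform.

```python
# Hypothetical opt-in consent state for a Quebec audience: nothing beyond
# strictly necessary cookies is enabled until the user explicitly opts in.
DEFAULT_CONSENT = {
    "strictly_necessary": True,   # always on; not subject to consent
    "analytics": False,
    "advertising": False,
    "personalization": False,
}

def record_opt_in(consent: dict[str, bool], categories: list[str]) -> dict[str, bool]:
    """Return an updated consent state with only the categories the user
    actively selected turned on."""
    updated = dict(consent)
    for category in categories:
        if category in updated and category != "strictly_necessary":
            updated[category] = True
    return updated

# Visitor opts in to analytics only
print(record_opt_in(DEFAULT_CONSENT, ["analytics"]))
```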

It remains to be seen whether other Canadian provinces will follow suit and implement opt-in requirements for information collection. 

#8 Frameworks for AI in the U.S. 

While the EU AI Act passed earlier this month, many other governments are also introducing AI amendments or provisions into their privacy laws. For example, many U.S. state regulations require explicit disclosure or consent for the use of automated decision-making or profiling technologies when processing data.

Of particular interest, the California Privacy Protection Agency (CPPA) published draft automated decision-making technology (ADMT) regulations. These ADMT regulations stipulate that the CPPA will regulate AI use and that companies must allow consumers to opt out of profiling. They would also regulate AI use in activities such as employee monitoring and monitoring in public places, such as Bluetooth tracking and facial recognition.13

The National Institute of Standards and Technology (NIST) has also published a framework, the NIST AI Risk Management Framework (AI RMF), that aims to provide organizations with guidelines for managing risks associated with AI. Similar to its cybersecurity and privacy frameworks, NIST’s AI RMF enables organizations to assess themselves against a series of metrics and benchmarks to help with risk prioritization and mitigation. It covers the full AI lifecycle and considers factors such as accuracy, safety, resiliency, accountability, transparency, and explainability.14

#9 Data Minimization 

With many new regulations limiting how long data can be retained, less may be more when collecting information. 

The New York State Department of Financial Services (NYDFS) stipulates in Section 500.13 of its cybersecurity regulation (23 NYCRR Part 500) that an organization must have policies in place for securely disposing of any data that is no longer needed.15

Although this regulation applies to financial institutions operating in New York, other regulations, such as CPRA and GDPR, also have similar data minimization requirements. 

This poses a challenge for organizations that may be storing information indefinitely as they would need to implement and operationalize a retention schedule to comply with these regulations in a defensible way.  

The first step in creating a data retention schedule is developing an inventory of all systems that store personal data, the data elements stored within each system, and the corresponding business processing activities. This gives the organization a clear understanding of why each data element is collected and how long it needs to be retained for the various processing activities. The organization then needs to operationalize the retention schedule by automating deletion procedures and educating employees on retention best practices, along the lines sketched below.
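As a minimal sketch of that operational step, the helper below checks whether a data element has outlived its retention period so that automated disposal jobs know what to purge; the retention periods and element names are purely illustrative assumptions, not recommendations.

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical retention schedule derived from the inventory described above;
# the periods below are placeholders, not legal guidance.
RETENTION_PERIODS = {
    "purchase_history": timedelta(days=365 * 7),
    "marketing_email":  timedelta(days=365 * 2),
    "support_tickets":  timedelta(days=365 * 3),
}

def eligible_for_deletion(element: str, last_needed: date, today: Optional[date] = None) -> bool:
    """Return True once the element's retention period has elapsed."""
    today = today or date.today()
    period = RETENTION_PERIODS.get(element)
    if period is None:
        # No schedule entry: escalate for review rather than delete silently.
        return False
    return today - last_needed > period

# An email address last needed in January 2021 is past its two-year period.
print(eligible_for_deletion("marketing_email", date(2021, 1, 15)))  # True
```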

#10 Privacy Considerations for International Organizations 

As organizations expand operations overseas, compliance with international data privacy regulations becomes imperative. Companies with European Union (EU) exposure must comply with the GDPR. With an increasing number of countries passing privacy legislation of their own, companies conducting business in many other regions should check to see which laws are applicable. Of note, China, Brazil, the United Kingdom, Canada, Australia, Singapore, and India all have adopted their own robust data privacy laws. 

Common themes for compliance with these regulations include transparent disclosure of data use or sale, affirmative consent, and appropriate data safeguards. 

Many of these local regulations also have specific requirements for transferring data across borders, which affects the many organizations whose data is hosted in multiple countries. In the case of the GDPR, for example, inter-company agreements, Standard Contractual Clauses (SCCs), and Transfer Impact Assessments (TIAs) may be required for international data transfers.

As of July 2023, U.S. companies can also self-certify under the EU-U.S. Data Privacy Framework (DPF) to receive personal data transferred from the EU.

#11 EU-U.S. Data Privacy Framework (DPF)

U.S. organizations can now self-certify for compliance with the new EU-U.S. Data Privacy Framework (DPF), with the ability to include the UK Extension to the EU-U.S. DPF and/or the Swiss-U.S. DPF, via an online form administered by the International Trade Administration (ITA) under the Department of Commerce (DOC).

This comes after a new adequacy decision by the European Commission went into effect in July 2023, allowing EU personal data to be transferred to organizations that adhere to the DPF, which was developed jointly by the U.S. Department of Commerce and the European Commission, the UK government, and the Swiss Federal Administration.

The DPF, successor to the invalidated Privacy Shield, is intended to serve as an adequacy mechanism allowing U.S. companies to transfer personal data from the EU without the need for additional GDPR legal transfer mechanisms, such as Standard Contractual Clauses (SCCs) or Binding Corporate Rules (BCRs).

In addition to providing an alternative to these cumbersome GDPR transfer mechanisms, DPF certification also serves as a public endorsement that the organization upholds privacy standards. Some European businesses would also like to see DPF certification before conducting business with U.S. organizations, and deal cycles may shorten because certified organizations engaging each other require less privacy due diligence.

Certification requirements currently include ensuring that the organization’s Privacy Policy complies with DPF principles, providing an independent recourse mechanism to investigate complaints brought by data subjects, paying a fee to the International Centre for Dispute Resolution-American Arbitration Association (ICDR-AAA) as part of an arbitral fund that must be maintained by the DOC, submitting an application on the DOC’s website, and paying an annual fee to ITA. 

For more on the EU-U.S. DPF and how to prepare for certification, please see this article.

#12 Role of the Data Protection Officer 

Article 37 of the GDPR requires certain organizations to appoint a Data Protection Officer (DPO) to protect the rights of data subjects and bridge the gap between Data Protection Authorities (DPAs) in the EU, data subjects, and the organization.

The DPO’s responsibilities range from monitoring compliance to raising privacy awareness within the organization to implementing training and conducting Data Protection Impact Assessments.  

In March 2023, the European Data Protection Board (EDPB) launched an initiative under its Coordinated Enforcement Framework (CEF) to assess whether DPOs within organizations have adequate resources to carry out their duties and are free from conflicts of interest within the organization.

The 26 DPAs taking part in this initiative have been soliciting information from DPOs via questionnaires or by launching full investigations. The goal of the initiative is to better understand how to make the DPO role more impactful and to help guide the EDPB in developing additional regulations around the DPO position if necessary.

The EDPB will publish its findings in a public report after the conclusion of this initiative in March 2024.

For more on this initiative, please see this article.

1. https://docs.legis.wisconsin.gov/2023/related/proposals/ab466.pdf 

2. https://apnews.com/article/ai-act-artificial-intelligence-regulation-europe06ab334caa97778770f5f57f4d904447

3. https://www.atg.wa.gov/protecting-washingtonians-personal-health-data-and-privacy

4. https://www.ftc.gov/business-guidance/resources/collecting-using-or-sharing-consumer-health-information-look-hipaa-ftc-act-health-breach

5. https://www.ftc.gov/news-events/news/press-releases/2023/05/ovulation-tracking-app-premom-will-be-barred-sharing-health-data-advertising-under-proposed-ftc?utm_source=govdelivery 

6. https://content.mlex.com/Attachments/2023-06-29_4745C8U6094V3K3O%2FCU_34-2023-80004106-CU-WM-GDS_10a66e19-7726-4167-bfca-5c1591881c5f8.pdf

7. https://cppa.ca.gov/meetings/materials/20230203_item4_text.pdf 

8. https://cppa.ca.gov/meetings/materials/20230908item8part2.pdf

9. https://iapp.org/news/a/understanding-us-state-law-pia-obligations/ 

10. https://www.parl.ca/DocumentViewer/en/44-1/bill/C-27/first-reading

11. https://angle.ankura.com/post/102irgc/quebec-privacy-bill-64-law-25-requirements-in-quebecs-privacy-law-that-go-be 

12. https://www.publicationsduquebec.gouv.qc.ca/fileadmin/Fichiers_client/lois_et_reglements/LoisAnnuelles/en/2021/2021C25A.PDF

13. https://cppa.ca.gov/announcements/2023/20231127.html 

14. https://www.nist.gov/itl/ai-risk-management-framework  

15. https://www.dfs.ny.gov/system/files/documents/2023/03/23NYCRR500_0.pdf  

© Copyright 2023. The views expressed herein are those of the author(s) and not necessarily the views of Ankura Consulting Group, LLC., its management, its subsidiaries, its affiliates, or its other professionals. Ankura is not a law firm and cannot provide legal advice.
