5 privacy trends for 2023 (from a privacy startup CEO)

Privacy
Ben Brook
January 23rd, 2023 · 7 min read

Growth.

If I could choose only one word to describe privacy in 2022, it would be growth. 

In 2022, we watched as:

  • The 10 largest GDPR fines totaled €780M,

  • 94% of organizations reported that privacy was a board-level issue, 

  • Sephora received the first-ever CCPA enforcement action, settling for $1.2M, and

  • U.S. businesses worked quickly to shore up compliance ahead of the three state privacy laws coming into force in 2023.

As of January 1, two of these new laws—the California Privacy Rights Act (CPRA) and the Virginia Consumer Data Protection Act (VCDPA)—are already in effect. And the Colorado Privacy Act isn’t far behind, scheduled to take effect on July 1.

With these laws come a host of new privacy requirements. Top of mind for me are the obligations to provide opt-outs for both targeted advertising and the processing of sensitive information. That these requirements have to be encoded across a company’s entire data ecosystem (rather than shallowly applied on a website) only adds another layer of complexity. 

2022 was a big year for privacy. But with new state laws coming into play, a brand new enforcement agency in California, the slow deprecation of third-party cookies, and the rise of new, powerful forms of generative AI—2023 is already shaping up to be one for the books. 

1. The California Privacy Protection Agency will crack down on Global Privacy Control enforcement

As 2023 progresses, we’re likely to see a massive uptick in Global Privacy Control (GPC) enforcement from the California Privacy Protection Agency (CPPA). 

As a refresher, GPC is a browser-based privacy preference signal. When turned on, GPC sends a signal to websites and platforms telling them you don’t consent to the sale or sharing of your data and that they should limit data collection.
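Concretely, the GPC proposal specifies that participating browsers attach a `Sec-GPC: 1` header to outgoing requests (and expose a `navigator.globalPrivacyControl` flag to client-side scripts). A minimal server-side check might look like the sketch below; the function name is mine, not part of the spec.

```typescript
// Sketch: detecting the Global Privacy Control signal server-side.
// Per the GPC proposal, browsers with GPC enabled send a `Sec-GPC: 1`
// request header; client-side code can instead read
// `navigator.globalPrivacyControl`. The function name is illustrative.

type RequestHeaders = Record<string, string | undefined>;

// Returns true when the request carries a GPC opt-out signal.
function hasGpcOptOut(headers: RequestHeaders): boolean {
  // HTTP header names are case-insensitive, so normalize before comparing.
  for (const [name, value] of Object.entries(headers)) {
    if (name.toLowerCase() === "sec-gpc") {
      return value?.trim() === "1";
    }
  }
  return false;
}
```

A request with the signal present should then be treated as an opt-out of sale/sharing for that user, the same as if they had clicked a "Do Not Sell or Share" link.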

For the first few years after GPC’s release, it wasn’t clear if this signal would be enforced. The California Attorney General (AG) had indicated support, but no enforcement actions were taken. 

That changed on August 24, 2022, when California AG Rob Bonta announced that cosmetics retailer Sephora had come to a $1.2M settlement with the state. 

According to the AG, Sephora had: 

  • Failed to inform consumers it was selling their personal data

  • Failed to honor opt-out requests made via GPC

  • Failed to cure these violations within 30 days (the standard cure period under the CCPA)

This settlement was a clear signal that California regulators intend to support GPC. 

And from an enforcement perspective, GPC compliance is an ideal area of focus. Where other aspects of compliance require deeper investigation, it’s actually relatively easy to audit whether a company is honoring GPC or not.

In fact, the enforcement examples provided by the AG’s office make it clear that many other companies were not honoring GPC. The difference is that those companies used the 30-day cure period to address their violations, whereas Sephora did not. 

This enforcement trend is only likely to increase through 2023, so if you haven’t already—my advice is to start working towards GPC compliance now.

2. Businesses will start enforcing consent preferences holistically

For the past several years, organizations have taken a fairly shallow approach to consent management. For many companies, consent management has been confined to implementing a cookie banner on the front page of their website. 

In the earlier days of consent, this was an effective solution, more or less.

But privacy regulations have evolved—now including browser-based signals like GPC and stricter language around the sale and sharing of data. Not only that, but the definition of tracking and data sharing has shifted within ad tech itself (think Apple’s App Tracking Transparency).

Combine this with the overall increase in enforcement (such as Sephora’s settlement with the California AG), and it’s clear that shallow consent implementations are no longer enough.

Companies need to go beyond managing only browser-based consent preferences and start looking for what we’re calling a “full stack” consent implementation. 

What this means in practice is that, yes, companies will need to continue providing a consent mechanism on their website. But more importantly, they’ll need to connect this experience to their backend infrastructure. 

This connection will allow them to:

  • Transmit consent preferences across their entire data ecosystem,

  • Sync and store consent across same-site domains,

  • Ensure preferences are synced across all user devices, and

  • Enforce consent preferences downstream to other vendors (e.g. data sent to Facebook Ads via Segment).
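To make the idea concrete, here's a rough sketch of what downstream enforcement could mean in code. All the types and names here are hypothetical illustrations, not any particular vendor's API: events bound for downstream destinations are gated on the user's stored consent record before they leave the backend.

```typescript
// Sketch (hypothetical types/names): gating downstream vendor syncs on a
// user's stored consent preferences, so an opt-out made in the browser is
// enforced across the backend data ecosystem, not just the website.

type Purpose = "Advertising" | "Analytics" | "SaleOfInfo";

interface ConsentRecord {
  userId: string;
  optedOut: Set<Purpose>; // purposes this user has opted out of
}

interface VendorEvent {
  userId: string;
  vendor: string;   // downstream destination, e.g. an ads platform
  purpose: Purpose; // why the data is being sent
  payload: unknown;
}

// Drop any event whose purpose the user has opted out of, before it
// is forwarded to a downstream vendor.
function enforceConsent(
  events: VendorEvent[],
  consent: Map<string, ConsentRecord>
): VendorEvent[] {
  return events.filter((e) => {
    const record = consent.get(e.userId);
    return !record || !record.optedOut.has(e.purpose);
  });
}
```

The design point is that the consent store is consulted at the egress point of the data pipeline, so a preference captured anywhere (banner, GPC signal, preference center) applies everywhere.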

3. More alternatives to third-party cookies

As web browsers begin to block the use of tracking technologies and regulators increase scrutiny, third-party cookies will continue to lose ground. 

Firefox and Safari have already blocked third-party cookies, and Google plans to deprecate them in Chrome in 2024. This, combined with increasingly strict limits on the use of personal data for targeted advertising, means marketers and advertisers will need to start exploring alternatives. 

One recent report estimated that, once third-party cookies are fully deprecated, publishers will experience a $10 billion loss in ad revenue. More than that, Google reports that, without a new way to collect audience data, ad publishers could experience a 50-70% decline in revenue.

As this trend progresses, the market will fill with more third-party cookie alternatives. Right now, the top contenders are various distributed ID systems and Google’s Privacy Sandbox, a collection of proposals and APIs that includes the Topics API (the successor to FLoC) and more. 

The implications of this shift are still up in the air, but as the year progresses and Chrome’s deprecation of third-party cookies grows closer, you can be sure more alternatives will come into play. 

4. Generative AI will continue to expand (potentially leading to a privacy incident)

2022 was a watershed year for generative AI. 

  • A Colorado artist won the state’s annual art competition using a piece generated by Midjourney,

  • Jasper.ai raised $125 million at a $1.5 billion valuation, and 

  • ChatGPT launched, built on OpenAI’s GPT-3.5 family of large language models. 

In 2023, there have already been rumors about the release of GPT-4, which is reported to work from over 1 trillion parameters, compared to GPT-3’s 175 billion. For reference, parameters are the learned weights of a neural network; a higher parameter count generally means more capacity for tasks like answering questions, translating languages, and generating content. 

The implications of these advances are far-reaching and we’ve really only begun to scratch the surface of how these technologies will affect the future.

As AI moves forward at a breakneck pace, the question of privacy and security becomes even more important. 

Generative AI tools don’t have the same rules of engagement as something like a search engine. They also lack a basic understanding of the world and aren’t yet bound by the complex content moderation rules that have become table stakes for search engines and social media platforms. 

With the launch of GPT-4 coming soon and the limitations of this technology in mind, it’s not a far leap to say that 2023 will see the first-ever privacy incident involving a generative AI platform. 

From what we’ve seen so far, such an incident could involve helping to write code for a malware program, producing error-free phishing emails, proliferating harmful misinformation, prompting an AI art generator to create a realistic fake image of a real person, and more. 

5. Security on federated web services will be tested

With Elon Musk’s turbulent takeover and the slew of issues in the months since, Twitter has seen an unsurprising user exodus. With many folks searching for a viable replacement, thousands have moved to federated web services like Mastodon. 

This article has a detailed explanation of how federated web services work. 

The federated nature of platforms like Mastodon does have upsides—more individual control over what and who you interact with (if you don’t like your current server, you can move to another) and strong social pressure to adhere to a particular server's code of conduct. 

However, operating and engaging with such loosely organized platforms does come with risks. 

As Mastodon has grown in popularity, so have reports of significant security and privacy issues. The main risk is that most server administrators are volunteers, not dedicated privacy and security professionals. 

Though platforms like Facebook, Instagram, and Twitter aren’t without their own security issues, they do maintain robust security and privacy teams whose entire focus is platform and data integrity.

In the months since this surge in popularity, there have already been several security issues across multiple Mastodon servers. In November 2022 alone, it was discovered that:

  • A misconfigured server had allowed the data of over 150,000 users to be scraped 

  • A different misconfiguration made it possible to download and delete all of a server’s files, as well as replace every user’s profile picture

  • A vulnerability in one server made it possible to steal passwords through a malicious HTML injection

More than that, anyone can run a Mastodon server, and a server’s administrators have full access to personal information such as users’ direct messages. 

Given that track record, it’s possible, if not likely, that 2023 will see a massive security breach on one or more of these federated web services. 

Conclusion

The privacy industry is in a constant state of movement and evolution, and there’s always something new to consider—that’s a huge part of why it’s such a fascinating field to build for. 

With three new laws going into effect, the CPPA getting up to speed, and new alternatives to third-party cookies, 2023 is set to be another interesting year. 

More than that, new AI developments and the increasing popularity of federated web services have raised plenty of questions about the future of technology as a whole and the integrity of new (and old) social platforms.

I can’t wait to see what happens next.


About Transcend

If your organization has been impacted by the California Privacy Rights Act's new employee data requirements or other consumer privacy laws, Transcend can help you ensure compliance. Learn how to fulfill employee DSAR with Transcend.

Transcend is the platform that helps companies put privacy on autopilot by making it easy to encode privacy across an entire tech stack.

Automate data subject request workflows with Privacy Requests, ensure nothing is tracked without user consent using Transcend Consent, mitigate risk with smarter privacy Assessments, or discover data silos and auto-generate reports with Data Mapping.

