Regulation Can Only Do So Much: It’s Time to Build for Better Data Privacy

Solutions Review’s Contributed Content Series is a collection of contributed articles written by thought leaders in enterprise tech. In this feature, Inrupt VP of Trust and Digital Ethics Davi Ottenheimer offers commentary on why regulation can only do so much in the era of data privacy.

More than two-thirds of consumers globally are concerned about their online data privacy. It’s a statistic that shouldn’t surprise anyone — unless they expected it to be even higher.

Recognizing the need to set standards and hold companies accountable for responsible data collection, governments around the world have enacted a slew of privacy laws designed to protect our digital selves.

Regulation is a necessary step. The problem is that it’s often reactive. It comes in the wake of major privacy breaches and it’s designed to course-correct society away from the worst version of our future.

But while privacy laws are informative and valuable, they cannot deliver a working solution to online privacy alone. Laws are not technical enough to prescribe exactly how to deliver the kind of practical solutions that grant users peace of mind about how businesses handle their data.

So rather than scrambling to comply with every new regulation that emerges, companies should see these laws as an opportunity for innovation. Legislation everywhere sends the same essential message: Current data management systems and technical infrastructure are failing to deliver meaningful data control and transparency.

It’s past time for the industry to take the hint and start building better privacy. Re-designing how data is stored, accessed, and controlled will bridge the gap between a baseline of legal compliance and user-centric privacy engineering best practices. Ultimately, the result will be increased trust and security between companies and users. Better yet, we can build the foundation for more powerful, individualized services and products that people actually want to use.

Where Data Privacy Regulations Fall Short for Consumers

As it stands now, a significant gap exists between what privacy laws are aiming to prevent and their practical implications for users expecting bigger changes to the privacy landscape.

Laws like Europe’s GDPR or the California Consumer Privacy Act (CCPA) enshrine user rights like the ability to opt out of targeted advertising, obtain a copy of their data, or request that their data be deleted. But they don’t give instructions to engineering departments about how to make any of these actions inherently accessible and standardized for users.

Sharing personal information like name, phone number, and date of birth is the price we pay for the ability to access nearly any online service. Unfortunately, once a user checks the box agreeing to a website’s terms and conditions, there’s no going back. If they change their mind, the only recourse is to email the company and request that their data be deleted. It’s like being carried aboard a ship while asleep: when they wake up in the middle of the ocean, their only options are to remain a captive or jump overboard.

Even if consumers can remember every company that has copies of their personal data — from online retailers and financial websites to public utilities and everything in between — it requires a significant amount of time and energy to investigate, manually submit objections, and follow up on data privacy requests.

For users, that certainly doesn’t feel like transparency or control. It feels more like having no choice at all.

Meanwhile, companies tasked with complying with privacy regulations are getting stuck in proprietary and dead-end implementations. As new laws emerge and existing rules evolve, companies feel pressured into resource-intensive projects that reorganize consumer data to comply with the latest standards but don’t actually achieve basic safety and usability measures. A costly rearrangement of deck chairs on the Titanic is not an approach anyone really wants to see.

The growing number of laws correctly restricting how companies receive authorization to use customer information should be driving us all toward a new way of thinking about compliance. Instead of repeatedly requesting consent for new use cases and re-collecting data they already possess, what if companies could use a performance-oriented consent mechanism that also reduces risks to privacy?

Building Trust with Data Operationalized for Privacy

Both users’ and companies’ pain points can be solved by operationalizing data for privacy in a streamlined way that enables users to easily manage their shared data. But it requires adopting new technology that organizes data around people, not around applications or data warehouses.

A user-centric data architecture makes it possible to achieve user visibility and control over personal data because data isn’t fragmented across giant hidden silos. Instead, each user’s data is housed in their own personal data store they can access. In this setup, users have the ability to see how companies are processing or manipulating their data. They can grant or revoke consent at any time for individual use cases.
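To make the model concrete, the per-user consent mechanism described above can be sketched in a few lines of Python. This is a minimal illustration of the idea, not any vendor’s API; the class name `PersonalDataStore` and the use-case labels are hypothetical:

```python
class PersonalDataStore:
    """Hypothetical per-user data store: the user holds their own data
    and grants or revokes access per company and per use case."""

    def __init__(self, owner):
        self.owner = owner
        self.data = {}       # field name -> value
        self.grants = set()  # (company, use_case) pairs the user approved

    def grant(self, company, use_case):
        self.grants.add((company, use_case))

    def revoke(self, company, use_case):
        self.grants.discard((company, use_case))

    def read(self, company, use_case, field):
        """Companies read through the store, so every access is gated
        by the user's current consent rather than a one-time checkbox."""
        if (company, use_case) not in self.grants:
            raise PermissionError(f"{company} has no consent for {use_case}")
        return self.data.get(field)


# Usage: the user approves one use case, then withdraws it at any time.
store = PersonalDataStore("alice")
store.data["email"] = "alice@example.com"
store.grant("acme", "order-updates")
store.read("acme", "order-updates", "email")  # allowed while consent holds
store.revoke("acme", "order-updates")         # access ends immediately
```

The key design choice is that consent lives with the data, not in each company’s database, so revoking it takes effect everywhere the data is read.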

This model accomplishes the objectives of privacy regulations by providing users with transparency and control, while also benefiting companies by making data more accessible and actionable for business decision-making. The end result is increased trust between organizations and consumers because it respects users’ basic rights, such as ownership over and the liberty to control personal information.

Data that is operationalized for privacy achieves:

  • Control: Users gain the ability to exercise meaningful technology-based control over how their data is used and can make choices about consent at any moment. Simultaneously, companies gain the ability to access consumer information in a controlled ecosystem that eliminates the need to build complex data infrastructure that’s counter-productive to individual rights. For smaller companies and startups, the fact that users inherently maintain control removes many of the barriers to entry that come with navigating complex privacy-compliance solutions.
  • Transparency: Users can see what’s happening to their data at all times and gain a clearer understanding of how companies are processing it. Transparency enables them to make informed decisions about shared data, which helps them feel comfortable sharing their data for new purposes. Likewise, providing transparency helps companies eliminate persistent data silos while simplifying the process of gaining consumer consent to operationalize data for new purposes.
  • Trust: When technology delivers user control and transparency, consumers are no longer compelled, without real alternatives, to blindly trust companies’ claims that they are responsible stewards of personal data. This newfound sense of security encourages consumers to place more trust in companies when it comes to data usage, and removes exit barriers should users decide to opt out down the road.
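The transparency point above implies one simple piece of machinery: an append-only record of every access, kept where the user can read it. A minimal sketch, assuming a hypothetical `AccessLog` kept alongside the user’s data store:

```python
from datetime import datetime, timezone

class AccessLog:
    """Hypothetical append-only log: every read of a user's data is
    recorded so the user can see who accessed what, when, and why."""

    def __init__(self):
        self.entries = []

    def record(self, company, use_case, field):
        self.entries.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "company": company,
            "use_case": use_case,
            "field": field,
        })

    def for_company(self, company):
        """Let the user review everything a given company has done."""
        return [e for e in self.entries if e["company"] == company]


log = AccessLog()
log.record("acme", "order-updates", "email")
log.for_company("acme")  # the user sees each access, timestamped
```

Because the log lives with the user rather than inside each company’s systems, transparency doesn’t depend on any one company choosing to report honestly.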

For instance, a healthcare provider may want to use AI to crawl consumers’ health records to surface potential risks that might otherwise go unnoticed, resulting in significant benefits for both consumers and providers. Still, many consumers would be understandably cautious about granting AI access to their records because of the risk of never being able to remove consent. But with data technology built for privacy in place, consumers are more likely to trust companies to access their data knowing they can benefit from more privacy-centric AI and change or revoke access at any point.

Delivering on the Promise of Privacy Requires New Data Technology

Privacy regulations worldwide emphasize what users value and need to make our society function best: control, transparency, and trust. These laws serve as important safeguards to spur innovations in technology, but they aren’t enough to ensure true privacy on their own without innovators building the next generation of tools.

To fully deliver on the promise of privacy regulations, we need new technology that is designed for privacy and puts control back in the hands of consumers. Vendors building this technology will foster trust between consumers and companies, opening the door for organizations to operationalize more data for more purposes and better decision-making.

Ultimately, it’s the synergy of legal frameworks and user-centric, privacy-first data technology that will bridge the gap between privacy regulations’ intent and practical implementation, ushering in both an important expansion of knowledge and higher levels of privacy.
