10 Best Practices for Improved Data Privacy and Governance


By: Syed Mahmood & Liz Tippitt

In our data-driven, modern computing world, organizations use data to better understand their customers, sharpen their competitive edge, and seek out new, innovative solutions. With all the advantages data can provide, it is essential that the way enterprises interact with data respects the privacy, governance, risk, and compliance parameters required by privacy and industry regulations like CCPA, GDPR, LGPD, HIPAA, and more. For a deeper dive into the importance of effective data governance strategies, read this blog.

In honor of International Data Privacy Day, our team put together a list of the top ten best practices enterprises can implement to accelerate the secure democratization of data across business units, while ensuring that data remains compliant with privacy and industry regulations to avoid legal, financial, or brand repercussions.

1. Know where your sensitive data lives

The journey to good data privacy and governance begins with knowing where sensitive and personally identifiable data is hiding among your on-premises and cloud data sources. This is a challenging task for any enterprise, as sensitive data can reside anywhere from granular form in the transactional databases that power applications like CRM systems, to aggregated form in the data warehouses used for decision support and modeling. Today, the volume of data flowing into any modern enterprise is so large that it is impossible for one person, or even one team, to analyze it for sensitive attributes.

Enterprises must employ a combination of techniques, including sophisticated rules, pattern matching, dictionaries, algorithms, and machine learning models, to understand the context of sensitive data and accurately classify it.
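As a small illustration of the pattern-matching piece of that toolbox, the sketch below scans sample column values against two hypothetical regex patterns. The pattern names, regexes, and function names are illustrative only; production classifiers combine such rules with dictionaries, checksums, and ML models, as noted above.

```python
import re

# Hypothetical patterns for two common PII types (illustrative, not exhaustive).
PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "US_SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify_value(value: str) -> list[str]:
    """Return the PII tags whose pattern matches the given value."""
    return [tag for tag, pattern in PII_PATTERNS.items() if pattern.search(value)]

def scan_column(values: list[str]) -> set[str]:
    """Aggregate the tags found across a sample of a column's values."""
    tags: set[str] = set()
    for value in values:
        tags.update(classify_value(value))
    return tags

sample = ["alice@example.com", "123-45-6789", "hello world"]
print(scan_column(sample))  # {'EMAIL', 'US_SSN'} (set order may vary)
```

A real scanner would sample values from each column, score match ratios, and only apply a tag when confidence crosses a threshold, rather than tagging on a single hit.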

2. Classify your sensitive data and implement tag-based access policies

Once sensitive elements in data are identified, enterprises need a way to classify them. Data classification is more than an academic exercise: it provides the foundation for applying tags to sensitive data. These tags help data stewards identify and keep track of sensitive data as it is moved or copied from one data source to another. The utility of data tags does not end there, however, as they are also used to control users’ access to data. For example, if a column in a table that lists employees’ salaries has been tagged as containing personally identifiable information (PII), administrators can write a policy allowing only personnel associated with Human Resources to access that column. Furthermore, tags can be used to encrypt sensitive data. In addition to role- and resource-based policies, enterprises need a data access governance platform that provides visibility into data classifications and enables policies based on tagged data.
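The salary-column example above can be sketched as a minimal tag-based policy check. All names here (the `PII` tag, the `HR` role, the policy table) are hypothetical and stand in for whatever a real governance platform would manage.

```python
from dataclasses import dataclass, field

@dataclass
class Column:
    name: str
    tags: set = field(default_factory=set)  # classification tags, e.g. {"PII"}

# Hypothetical tag-based policy: columns tagged PII are readable only by HR.
TAG_POLICIES = {"PII": {"allowed_roles": {"HR"}}}

def can_read(user_roles: set, column: Column) -> bool:
    """Allow access unless some tag on the column restricts it to roles the user lacks."""
    for tag in column.tags:
        policy = TAG_POLICIES.get(tag)
        if policy and not (user_roles & policy["allowed_roles"]):
            return False
    return True

salary = Column("salary", tags={"PII"})
print(can_read({"HR"}, salary))           # True
print(can_read({"Engineering"}, salary))  # False
```

The key property of this model is that the policy is written once against the tag, so a newly discovered salary column inherits the restriction as soon as it is tagged, with no per-column policy work.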

3. Implement centralized access control across your data sources

With the advent of cloud computing, it is not uncommon for enterprises to subscribe to multiple services from public cloud and third-party providers. Each public cloud service provides its own version of access control and management, and third-party cloud services like Databricks and Snowflake have their own mechanisms for controlling data access. Administrators are forced to learn multiple mechanisms to authorize user access and navigate to disparate interfaces in order to administer and enforce access controls. This leads to long delays before users can be provided with access to data they need to do their jobs.

With this proliferation of cloud data services, it is essential enterprises adopt a data access governance platform to centralize authorization of users’ data access, implement consistent policies, onboard users faster, and enhance data teams’ productivity. A unified data access governance platform also empowers internal and external auditors to view data access policies across services in one location to validate compliance with privacy and industry regulations.

4. Implement fine-grained data access controls

With fine-grained access controls, data administrators can create policies that restrict data access for unauthorized users, enabling data teams to securely access only the datasets they have permission for, avoid the time-consuming process of manually requesting access from each separate data owner, and deliver high-quality insights and analytics faster.

The richer the data access governance platform’s ability to administer policies to finer grains of data, the easier it is for infrastructure administrators to grant access to the precise data users need to do their jobs. Ensure that your data access governance platform provides:

  • The flexibility to define access policies at a database-, table-, column-, or file-level
  • Comprehensive controls, like role-based access controls (RBAC), attribute-based access controls (ABAC), and tag-based policies
  • The ability to implement dynamic data masking and row filtering to protect data from unauthorized access (e.g., sensitive data or personally identifiable information, such as credit card numbers or social security numbers)
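To make the last two controls concrete, here is a minimal sketch of dynamic masking and row filtering applied at query time. The schema, region-based filter rule, and masking format are assumptions for illustration; real platforms enforce these transparently in the query path.

```python
import re

def mask_ssn(value: str) -> str:
    """Dynamically mask a social security number, exposing only the last four digits."""
    return re.sub(r"\d{3}-\d{2}-(\d{4})", r"***-**-\1", value)

def filter_rows(rows: list[dict], user_region: str) -> list[dict]:
    """Row-level filter: return only the rows belonging to the user's region."""
    return [row for row in rows if row["region"] == user_region]

rows = [
    {"name": "Ann", "ssn": "123-45-6789", "region": "EU"},
    {"name": "Bob", "ssn": "987-65-4321", "region": "US"},
]

visible = [{**row, "ssn": mask_ssn(row["ssn"])} for row in filter_rows(rows, "EU")]
print(visible)  # [{'name': 'Ann', 'ssn': '***-**-6789', 'region': 'EU'}]
```

Because the masking and filtering happen per request, the same underlying table yields different result sets for different users without duplicating the data.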

5. Audit your data access policies frequently

Even enterprises with comprehensive data access policies in place can lack clear visibility into the effectiveness of those policies. In this case, it is nearly impossible to prove to internal and external auditors that data access policies comply with industry regulations and best practices, such as GDPR, LGPD, CCPA, HIPAA, and more. To avoid any compliance or privacy violations, enterprises need comprehensive audit frameworks to gain rich contextual metadata, as well as to track how data is accessed and used, including what resources are accessed, IP addresses, locales, specific policies invoked, and actions taken for each access request across all services.

With the threat unauthorized data access poses to enterprises, these auditing and reporting capabilities must operate in real time, so auditors and administrators can use pre-built reports, as well as raw event data, to perform additional analysis and post-mortems of unauthorized events.
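A structured audit record of the kind described above might look like the following sketch. The field names are illustrative, not any particular product's schema; the point is that each access request captures who, what, where, which policy fired, and the outcome, in a machine-readable form that reporting tools can consume.

```python
import json
import datetime

def audit_event(user: str, resource: str, action: str,
                allowed: bool, policy: str, client_ip: str) -> dict:
    """Build one structured audit record for a single access request.
    Field names are hypothetical, chosen to mirror the context listed above."""
    return {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "resource": resource,     # e.g. warehouse.schema.table
        "action": action,         # e.g. SELECT
        "allowed": allowed,       # outcome of the policy decision
        "policy": policy,         # which policy was invoked
        "client_ip": client_ip,
    }

event = audit_event("analyst1", "warehouse.hr.salaries", "SELECT",
                    False, "pii-hr-only", "203.0.113.7")
print(json.dumps(event))  # one JSON line per request, ready for a log pipeline
```

Emitting one self-contained JSON record per request is what makes both the pre-built reports and ad hoc post-mortem queries possible from the same event stream.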

6. Make data governance part of your business processes early

In order for data governance to be effective, it must be part and parcel of enterprises’ business processes. Until recently, administrators manually wrote and managed individual access policies, requiring them to maintain intimate knowledge of the underlying data engine’s authorization model, such as identity and access management (IAM) policies. Enterprises should adopt data governance capabilities that seamlessly integrate with existing business processes and workflows to automatically generate entitlement policies for new users and update existing policies based on users’ requests. With data governance integrated into business workflows, data administrators are empowered to apply data access policies consistently for all members of an enterprise and ensure faster access to authorized data.

7. Ensure your data is encrypted (at rest or in motion)

Throughout various industries and geographies, enterprises must navigate new attack vectors and risks associated with protecting their sensitive data. The three fundamental encryption methods (full volume encryption, file- or data zone level, and attribute- or field-level) are critical in order to protect data against these attack vectors and serve different purposes for different use cases. To ensure comprehensive visibility and control of sensitive data, enterprises should adopt a data access governance platform that implements all three encryption methods to enable the safe access of their data across their enterprises, while maintaining stringent compliance with the various privacy regulations and standards.
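As a sketch of the finest grain of the three, attribute- or field-level protection, the example below uses deterministic HMAC tokenization from the standard library rather than reversible encryption, since it is self-contained; reversible field-level encryption (e.g., AES) would be used where the original value must be recovered. The key and field names are assumptions for illustration, and real deployments would use a managed key service, not a hard-coded key.

```python
import hmac
import hashlib

# Assumption for illustration only: real key management belongs in a KMS/HSM.
SECRET_KEY = b"demo-key-not-for-production"

def tokenize_field(value: str) -> str:
    """Field-level protection via deterministic HMAC tokenization: the same
    input always yields the same token (so joins and group-bys still work),
    but the raw value is never stored."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"name": "Ann", "ssn": "123-45-6789"}
protected = {**record, "ssn": tokenize_field(record["ssn"])}
print(protected["ssn"] != record["ssn"])  # True: the raw SSN is no longer present
```

Full-volume and file-level encryption protect data at rest wholesale; field-level protection like this is what lets the rest of a record remain usable for analytics while the sensitive attribute stays shielded.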

8. Migrate your pre-built, on-premises access policies to the cloud

If your enterprise invested significant time, effort, and resources to build unique access control policies for on-premises data repositories, it is redundant and time-consuming to recreate those same policies in the cloud. For example, data access governance platforms based on virtualization technology force data analysts and scientists to rewrite queries from scratch to point to the virtualized layer. This is an unnecessary burden on data consumers that significantly delays data analysis. Enterprises should invest in a data governance platform that preserves the time and effort they have invested building policies for their on-premises repositories. Seamless migration of on-premises policies to public cloud services will ensure consistency across hybrid implementations, reduce the proliferation of policies, and accelerate user access to data in cloud services.

9. Remove the burden of managing access control infrastructure from your IT team

Self-hosted data access governance platforms require IT teams to download the code, install it in their environment, configure the environment to connect data sources, then invest in updates, upgrades, and maintenance. All these tasks require resources with specialized skillsets to keep the platform operational. The burden of provisioning the environment, applying critical patches and updates, and performing infrastructure maintenance leads to overstressed IT teams, which can result in inconsistent policies and unauthorized data access. Enterprises should consider subscribing to a fully managed, cloud-hosted service to provide data governance capabilities across cloud services. This will empower IT infrastructure teams to focus on value-add activities by delegating tedious tasks, like provisioning the infrastructure or updating and maintaining the governance platform, to the service provider.

10. Select architecture to optimize performance

A number of data access control tools currently available were originally built for another purpose (e.g., data virtualization platforms originally developed to provide data analysts and scientists access to data from a number of sources). Whether a solution was built specifically to define and administer data access control policies or not has several important implications. Virtualization products insert an additional layer between analytics services and data storage and perform extensive data processing. When used as a data access control solution, all users’ requests to access data are routed through the virtualized data access point, which negatively impacts the system’s performance, as all queries are required to go through the virtualized data layer. Additionally, metadata must be recreated in the new solution, creating substantial overhead for IT teams who must then rewrite client applications in the new virtualization platform.

Enterprises should perform due diligence to determine the origin of the data access governance platform before they adopt it to manage their users’ access to data. Thoroughly investigate the platform’s performance during the evaluation phase by simulating data volume and user request scenarios.

To learn more about how Privacera helps enterprises centralize data access governance across multi-cloud and hybrid cloud environments to ensure privacy and compliance, contact us at [email protected], or read our whitepaper to learn more about how to evaluate data access governance platforms.


Contact Privacera for a Data Governance and Security Demo Today