Pursuant to a Presidential executive order issued earlier this year, the United States Office of Management and Budget (OMB) recently released a draft Zero Trust Strategy for the entire federal government. After a rash of cyberattacks targeting both the public and private sectors, the aforementioned order directs various federal departments and agencies to take steps to improve their security, including by implementing Zero Trust Architecture (ZTA) principles.
While there are many components to such a model, at its core it assumes that a breach by a malicious actor is “inevitable or has likely already occurred.” With such a premise in mind, those designing, deploying, and using information systems can strictly limit the access of such an actor, and as a result, the damage it can cause.
To that end, the OMB strategy covers five functional areas: identity, devices, networks, applications, and data. Analyzing and implementing the requirements for each would be a massive undertaking, so this post will focus on the final category: data.
Regulating access to data is, in the end, the most important part of information security. Whether they are state actors, criminals, or lone-wolf hackers, attackers almost always seek to steal important data or to deny their victims access to it. In tackling the challenges inherent to managing the vast stores of information maintained by government departments and agencies, cutting-edge tools in the evolving field of data access governance can help protect against these threats.
The total volume of data maintained or even processed by the American federal government is enormous and likely impossible to know with any precision. A historical document from the National Security Agency (NSA) suggested that it processed 29 petabytes of data a day in 2013. Considering that this figure represents the daily activity of only one organization almost a decade ago, the government's total data holdings are now likely in the zettabyte or even yottabyte range.
Considering that this enormous quantity of information is stored across multiple cloud providers (not to mention thousands of traditional data centers), merely inventorying it is extremely difficult. The OMB strategy recognizes this fact, noting that “[d]eveloping a comprehensive, accurate approach to categorizing and tagging data will be challenging for many agencies.”
Fortunately, a new generation of intelligent tools has emerged to help overcome these problems. They include automated data discovery software that can identify and tag information using dictionaries of keywords, pattern matching processes, and heuristic models. These solutions can help federal information technology (IT), security, and compliance personnel classify the vast array of data under their stewardship.
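As a rough illustration of how keyword- and pattern-based discovery works, the sketch below tags text values using a small dictionary of terms and a couple of regular expressions. The tag names, keywords, and patterns are invented for illustration; production tools rely on far larger rule libraries plus heuristic or machine-learning models.

```python
import re

# Illustrative keyword dictionary and regex patterns for common sensitive-data types.
# These rules are examples only, not the rule set of any particular product.
KEYWORDS = {
    "PII": ["ssn", "social security", "date of birth"],
    "FINANCIAL": ["routing number", "account number"],
}
PATTERNS = {
    "US_SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def classify(text: str) -> set:
    """Return the set of classification tags that match a text value."""
    tags = set()
    lowered = text.lower()
    for tag, words in KEYWORDS.items():
        if any(word in lowered for word in words):
            tags.add(tag)
    for tag, pattern in PATTERNS.items():
        if pattern.search(text):
            tags.add(tag)
    return tags

# Example: tag a few sample records before adding them to a data catalog.
records = [
    "Applicant SSN: 123-45-6789",
    "Contact: jane.doe@agency.gov",
    "Routing number on file for reimbursement",
]
for record in records:
    print(record, "->", classify(record) or {"UNCLASSIFIED"})
```

In practice the resulting tags feed a central catalog, which is what makes the downstream access policies described next enforceable at scale.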
Once a solid inventory is complete, the OMB strategy requires federal Chief Data Officers to “automatically monitor and potentially restrict how these documents are shared.” Given the vastness of the federal IT infrastructure, having a single location from which to manage access control policies is another key requirement of any data access governance product. Industry leaders in this space are able to integrate with existing data sources through a variety of architectures. Furthermore, they can permit or deny access (and record these actions) at column- and row-level granularity to facilitate compliance with an array of regulations and other obligations.
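Policy models differ by product, but the sketch below shows one hypothetical way a centrally stored policy could be evaluated at column and row granularity, with every decision written to an audit trail. The table, roles, and policy fields are invented for illustration and do not reflect any specific vendor's policy format.

```python
import json
from datetime import datetime, timezone

# A hypothetical, centrally stored policy: which roles may read which columns,
# plus a row-level filter limiting analysts to their own agency's records.
POLICY = {
    "table": "citizen_benefits",
    "column_access": {
        "analyst": ["case_id", "state", "benefit_type"],
        "auditor": ["case_id", "state", "benefit_type", "ssn"],
    },
    "row_filter": {"analyst": lambda row, user: row["agency"] == user["agency"]},
}

AUDIT_LOG = []  # in practice this would be a tamper-evident audit store

def query(user: dict, rows: list, columns: list) -> list:
    """Apply column- and row-level policy to a query, recording the decision."""
    allowed_cols = set(POLICY["column_access"].get(user["role"], []))
    denied = [c for c in columns if c not in allowed_cols]
    row_filter = POLICY["row_filter"].get(user["role"], lambda r, u: True)
    visible = [r for r in rows if row_filter(r, user)]
    result = [{c: r[c] for c in columns if c in allowed_cols} for r in visible]
    AUDIT_LOG.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user["name"],
        "table": POLICY["table"],
        "columns_requested": columns,
        "columns_denied": denied,
        "rows_returned": len(result),
    })
    return result

rows = [
    {"case_id": 1, "state": "VA", "benefit_type": "housing", "ssn": "123-45-6789", "agency": "HUD"},
    {"case_id": 2, "state": "MD", "benefit_type": "medical", "ssn": "987-65-4321", "agency": "HHS"},
]
analyst = {"name": "analyst1", "role": "analyst", "agency": "HUD"}
print(query(analyst, rows, ["case_id", "state", "ssn"]))  # ssn is masked out
print(json.dumps(AUDIT_LOG, indent=2))
```

The point of the sketch is the shape of the capability: a single policy definition, enforced at fine granularity, with an audit record produced as a side effect of every access decision.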
Once they have inventoried and locked down access to their organization’s data, government employees will need to ensure its protection. To do so, the OMB strategy requires that “agencies must use independently operated [encryption] key management tools to create a trustworthy audit log of access to that data,” especially for that which is stored with commercial cloud providers. Having a robust tool that can use industry-standard cryptographic protocols to safeguard critical information is thus the final requirement of an effective data access governance offering.
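As a simplified picture of encrypting data at rest with an auditable key-management step, the sketch below uses the widely available `cryptography` package's Fernet authenticated-encryption recipe and logs each key retrieval. The in-memory key store and principal names are stand-ins; a real deployment would rely on an independently operated key-management service or HSM, as the OMB strategy requires.

```python
import logging
from datetime import datetime, timezone

from cryptography.fernet import Fernet  # pip install cryptography

logging.basicConfig(level=logging.INFO, format="%(message)s")
audit = logging.getLogger("key-audit")

class AuditedKeyStore:
    """Hypothetical key store standing in for an independently operated KMS.
    Every key retrieval is written to the audit log."""

    def __init__(self):
        self._keys = {"dataset-42": Fernet.generate_key()}

    def get_key(self, key_id: str, principal: str) -> bytes:
        audit.info("%s key=%s accessed_by=%s",
                   datetime.now(timezone.utc).isoformat(), key_id, principal)
        return self._keys[key_id]

store = AuditedKeyStore()

# Encrypt a record before writing it to commercial cloud storage.
plaintext = b"benefit case 42: approved"
writer = Fernet(store.get_key("dataset-42", principal="etl-service"))
ciphertext = writer.encrypt(plaintext)

# Later, an authorized reader retrieves the key (audited) and decrypts.
reader = Fernet(store.get_key("dataset-42", principal="analyst1"))
assert reader.decrypt(ciphertext) == plaintext
```

Because every decryption requires a key retrieval, the key store's log becomes exactly the kind of trustworthy record of access to cloud-hosted data that the strategy calls for.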
Fully implementing ZTA across networks as sprawling as those maintained by the United States government will be a significant effort, but agencies can take immediate steps in the right direction. Choosing the right tool at the outset of any initiative can assist federal data professionals enormously. One that is able to discover and classify sensitive information, control and audit access to it, and then protect it at rest – across multiple cloud providers – will be “table stakes” for agencies seeking to implement the new OMB directive.
Learn more about Privacera here, or contact us to schedule a call to discuss how we can help your organization meet its dual mandate of balancing data democratization with security to maximize business insights while ensuring privacy and compliance.