ATO in a Day
Could it be possible to complete the ATO process in just 24 hours?
Jason Hess, who was until recently the cloud security chief at the National Geospatial-Intelligence Agency (NGA), one of our premier intelligence agencies, spoke publicly about security compliance automation at a recent conference I attended.
Demonstrating a flair for branding not often seen in security professionals, Jason highlighted NGA’s assessment and authorization (A&A) process for getting unclassified software applications to a public cloud as NGA’s “ATO-in-a-Day.”
Scott Kaplan, Jason’s successor and current NGA cloud security chief, discussed the A&A process at AWS’s recent public sector summit, stressing the need for it to meet the speed of the mission.
A bit of background: an Authority to Operate (ATO) is the security approval to launch a new IT system in the federal government — a senior official grants an ATO based on a risk-based assessment documented in a security plan. This well-meaning process is a result of legislation and a resulting standards-based federal framework, generally coupled with additional agency-specific requirements.
In practice, in many federal agencies, it takes a year or more to get an ATO, and the security plan can be hundreds of pages — a volume of information hard for a human to fully parse. The idea of reducing this to a single day is an interesting business process challenge.
It’s also quite common for the specific steps to obtain an ATO to differ from person to person, which ultimately encourages everyone to take many unnecessary extra steps “just in case,” further prolonging the process without adding benefit. Further, the weight of the process strongly discourages ever changing a system, which would trigger the ATO process anew; this stagnation leaves most government legacy systems at great risk for vulnerabilities, not to mention lost innovation and productivity.
We need innovative firms entering the federal market — like those that Insight Venture Partners invests in — to make our government more secure, more effective at delivering services, and more efficient for the taxpayer.
Yet many companies are deterred not just by the sheer volume of steps required for approval, but by the manual and ambiguous nature of today’s process; a modern company that updates its code multiple times per day cannot fit in the “box” of today’s static ATO process.
This doesn’t make sense for the federal government or its taxpayers. It doesn’t make sense for the mission-focused public servants using IT in government — whether they are intelligence officers, first responders, environmental regulators, or public health professionals. And it doesn’t make sense for recipients of federal government services — who are increasingly using digital channels when they interact with government.
The NGA isn’t the only one recognizing the problem — the General Services Administration (GSA) is also taking note. In partnership with the White House, GSA is asking about approaches to speed and automate the security compliance process. GSA has also launched Project Boise, “to reduce the burden (time, cost, and pain) and improve the effectiveness of the federal government’s software security compliance processes.”
To add to the conversation, Insight Venture Partners and portfolio companies Tenable, Docker, Checkmarx, Thycotic, and Prevalent authored a whitepaper about automating security compliance — arguing that agencies should 1) use more standardized technology architectures, and 2) deploy an automated compliance solution. Read the full whitepaper here.
Regardless of technical approach, I’d suggest a few principles that federal agencies should consider:
- Reusability. Reuse platforms, infrastructure, and documentation, including a set of pre-approved architectures, technology stacks, and control implementation descriptions.
- Automation. Automate data collection where possible, with software platforms mapping information already maintained by existing systems to the concepts standardized in a common data taxonomy.
- Structured data. Default to structured data, not documents. Documents should be easy to create on demand, as needed — but information should be represented as machine-readable and human-readable data.
- Portability / Reciprocity. An automated system for security compliance should be usable across federal agencies.
- Simplicity. An automated compliance system should be easy to use, easy to understand, and as simple as possible.
- Transparency. All parties — including developers, operations, security, and senior officials — should be able to see the information they need, when they need it, appropriate for their role and level of access. It should also always be clear to all parties what steps are (and are not) required, in a consistent and replicable manner.
- Actionability. Based on the information they need, all parties would be able to take necessary action. Senior authorizing officials granting an ATO would be able to make an informed risk decision.
- Interoperability. Too many tools in the security ecosystem lack robust, publicly available APIs that can communicate with other security tools; an automated compliance system should both expose and consume such APIs.
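The "structured data, not documents" principle above can be sketched in a few lines: keep control implementations as machine-readable records, then generate human-readable documentation on demand. This is a minimal illustration, not any agency's actual data model — the control IDs follow NIST SP 800-53 naming, but the record shape and the `render_security_plan` helper are hypothetical.

```python
# Minimal sketch: control implementations as structured data, with the
# human-readable security plan generated on demand. The record fields
# ("id", "title", "status", "narrative") are illustrative assumptions.

controls = [
    {
        "id": "AC-2",
        "title": "Account Management",
        "status": "implemented",
        "narrative": "Accounts are provisioned via the agency identity service.",
    },
    {
        "id": "AU-2",
        "title": "Audit Events",
        "status": "partial",
        "narrative": "Application logs are centralized; database auditing is pending.",
    },
]

def render_security_plan(controls):
    """Produce a human-readable plan from machine-readable control data."""
    lines = ["System Security Plan (generated)"]
    for c in controls:
        lines.append(f"{c['id']} {c['title']} [{c['status']}]: {c['narrative']}")
    return "\n".join(lines)

print(render_security_plan(controls))
```

Because the data, not the document, is the source of truth, the same records can feed an assessment dashboard, an API for another agency's tooling, or a regenerated plan after every code change.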
Boring but Critical
Like a lot of government modernization, this is boring — at least to most people! — but absolutely critical to make government work better.
If the Trump Administration is going to build on the Obama Administration’s efforts to modernize, it will need to transform how the federal government does security compliance.
The recent executive order, Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure, holds agency heads responsible for future breaches — and increases the spotlight on making informed risk decisions.
Let’s make it easier for agencies to responsibly launch great software — while making security more dynamic and integrated into the entire process of building and managing software.
Let’s integrate security and compliance into the very beginning of how federal agencies buy and build IT systems — combining development, security, and operations (DevSecOps), rather than bolting on security at the end.
Read the whitepaper: Automating Security Compliance in the U.S. Federal Government.
ATOs must be a popular subject! In a week, this post got almost 2,000 views across Medium and LinkedIn, plus I was quoted in an FCW story about 18F’s Project Boise and FedScoop’s story about 18F working to overhaul the ATO process.
Francis Rose, host of Government Matters, also interviewed me about 18F’s efforts and my blog post; see the interview here.
I also learned more about how NGA and USCIS have made progress in this area.
Scott Kaplan, Chief of Cloud Security at NGA, wrote in public LinkedIn comments below the post:
- “We’re well on our way, in our most recent effort on putting apps through the DevSecOps pipeline, we’ve gotten it down to 5 days! We’re now working on automating the BOE (documentation) which should be the next big step towards ATO in a Day.”
When I wrote back and asked Scott if NGA was just automating the documentation, or if NGA was also moving to capturing and storing security information dynamically, he replied:
- “Both…goal is to continually update the documentation with the code, and being able to push that to the high side, where we’re currently using Xacta as our final Assessment tool. We’re also working to ensure that it is connected in to the continuous monitoring of the system, to allow the continuous ATO. Several companies (ION Channel, BlackSky, as well as internal folks), and most recently GovReady who’s helping DHS as well, are helping us achieve ATO in a Day.”
USCIS leadership in ATO automation
I also spoke with Sarah Fahden, the Chief of the Verification Program at US Citizenship and Immigration Services (USCIS). Sarah previously worked within the security group at USCIS and was responsible for managing the FISMA Compliance activities to achieve an ATO, including all security testing.
I was impressed to learn that, under Sarah (and CIO Mark Schwartz), USCIS has already tackled the problem — by streamlining the security technologies within the cloud infrastructure (AWS), automating the authorization processes, and standardizing the documentation.
Over the past few years, USCIS changed the static, document-heavy ATO process into a faster and more dynamic process — what Sarah calls “Ongoing Authorization.” Instead of needing to re-do an ATO every three years, a USCIS ATO is continuously valid until revoked.
Simple things, like IT system boundaries, had to be standardized and represented visually the same way, or else the security plan wasn’t easily understandable. Sarah instituted a detailed mock review of each system’s security posture with its system owner, ensuring it included specifics about how vulnerabilities would be remediated, before the system went in front of CIO Mark Schwartz for ATO approval.
In addition, Sarah conducted a re-boundary effort of all systems across USCIS to remove duplication of work and created a process for systems to inherit controls from common security boundaries — such as data centers, CISO controls, and physical security controls. This reduced duplicative work across contractors and federal staff by more than 30 percent.
Like NGA, USCIS moved to standardized cloud infrastructure, which means that IT systems can simply “inherit” the controls of the underlying cloud provider. Why re-document something that security has already approved?
According to Sarah, USCIS uses Tenable’s Security Center, Splunk, DBProtect, Fortify, and many other tools to create an automated and continuous monitoring process that constantly scans and updates the security posture of a system. This process generates a monthly scorecard detailing the overall security posture of each system; USCIS’s Risk Management Board (RMB) then reviews the scorecard with the Security Officer, System Owner, and Program Manager of each system. If a system’s security posture begins to decline, that system’s personnel are required to meet with the USCIS CIO to provide a detailed remediation plan for fixing the issues.
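A scorecard process like this can be sketched as a simple aggregation: weight each system's open findings by severity, then map the total to a grade that triggers review when it slips. To be clear, the weights, thresholds, and `score_system` function below are hypothetical illustrations, not USCIS's actual scoring model.

```python
# Hypothetical monthly-scorecard sketch: weight open scan findings by
# severity and map the total to a letter grade. All numbers here are
# illustrative assumptions, not a real agency scoring model.

WEIGHTS = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def score_system(findings):
    """Return a letter grade for one system's open findings."""
    total = sum(WEIGHTS[f["severity"]] for f in findings)
    if total == 0:
        return "A"
    if total < 10:
        return "B"
    if total < 25:
        return "C"
    return "D"  # a declining grade would trigger a remediation-plan review

findings = [
    {"severity": "high", "issue": "outdated TLS version"},
    {"severity": "medium", "issue": "weak cipher suite"},
]
print(score_system(findings))  # → B (weighted total of 7)
```

The point of such a rollup is exactly what the USCIS process achieves: scanners run continuously, but leadership sees one comparable number per system per month, and a declining score automatically escalates to the CIO.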
These automated processes have freed up USCIS information security officers to do more impactful work that focuses on resolving security deficiencies, rather than writing documentation. Once a month, the information security officers review their always-current security plans, in addition to responding to meaningful alerts.
Beyond GSA, NGA, JIDO, and USCIS, I’m sure there are other agencies also doing important work in automating the security process, baking security into the development process, and freeing up security professionals to do more important work.
Given the activity and interest in this topic, maybe the White House should convene agencies that are interested in or making progress on this issue?