Securing the Status Quo
The Effect of Federal IT Security Policies on Innovation
The United States Federal Government is the single largest purchaser of information security products.1 In FY 2011 alone, a mere 24 agencies2 reported a combined IT security budget of $13.3 billion, employing the equivalent of some 84,426 full-time employees with major responsibilities for information security.3 The gravity of the ongoing threat to our nation’s cybersecurity has been described in the starkest of terms at the highest levels of government. Describing our current situation, President Obama remarked that “[t]he status quo is no longer acceptable,”4 members of Congress from both parties have characterized the risk as a catastrophe in the making,5 and Secretary of Defense Leon Panetta warned of an imminent digital Pearl Harbor.6
Yet, despite all these histrionics, successful attacks against the nation’s most hardened systems are perpetrated each day. In 2005, for example, a lone hacker in the United Kingdom was able to remotely render more than 300 computers at an American naval station completely inoperable by simply deleting a single file.7 In 2011, foreign intruders gained access to more than 24,000 files, including those concerning the military’s most sensitive systems: surveillance technologies, satellite communications systems, and network security protocols.8 And as recently as October of 2012, the Chinese government attempted to infect computers within the White House office responsible for maintaining the President’s nuclear launch codes.9
At the same time, the government computer systems that agencies rely on for many internal processes are often years behind the technology readily available to consumers, who can run entire sites from the mobile phone resting in their pocket. Whereas once the federal government pushed the capabilities of information systems — using the first computers to calculate projectile trajectories during the Cold War or interconnecting the first machines to give rise to what is today the internet — today government and innovation are far from synonymous. Over the past few decades the IT industry has undergone a radical transformation towards consumerization, a transformation that has largely left the public sector behind. Whereas once deploying a technology solution required months of upfront planning and investment, today lean, iterative, and decentralized solutions dominate the marketplace — technologies ill-suited for adoption within the government’s paperwork-laden security regime.
Current government security restrictions rightfully seek to protect large, mission-critical systems. Yet such efforts often come at the cost of significantly hindering the adoption of smaller, less vital systems, many of which — such as blogs or social media tools — can have a greater impact on the day-to-day life of average Americans. The availability of cheaper, leaner solutions, now commonplace in the private sector, can not only provide government with a unique opportunity to meet citizens’ demands to do more with less, but can also empower it to expand service offerings into new verticals, such as transacting additional citizen services online or offering greater transparency within existing offerings. To unlock the potential of emerging technologies, security requirements for pilot programs should be relaxed so that agencies are empowered to follow private sector best practices to rapidly bring prototype solutions to market and are given the freedom to tap existing third-party, often free, services.
I. The Current IT Security Regime
The federal government’s current information security policy is defined primarily by two key documents, the Federal Information Security Management Act and the Privacy Act, and is operationalized through a patchwork of recommendations and requirements outlined by the Federal Acquisition Regulation, the National Institute of Standards and Technology, and the Office of Management and Budget.
A. The Federal Information Security Management Act
The Federal Information Security Management Act (FISMA) is designed to “provide a comprehensive framework for ensuring the effectiveness of information security controls over information resources that support Federal operations and assets.”10 By harmonizing overlapping agency requirements, eliminating obsolete mandates, and updating outmoded provisions, the Act sought to unify Congress’s attempts over the previous decade to address information security needs piecemeal through a scattered mosaic of legislation.11 Specifically, the act’s purpose is to protect information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to maintain the integrity, confidentiality, and availability of such information.12 FISMA has several components. First, it designates the Director of the Office of Management and Budget (OMB) as having the authority to oversee agency information security policies and practices, requires agencies to provide information security protections, and requires oversight of agency compliance with FISMA through annual reviews and reports to Congress.13 To facilitate this end, FISMA empowers OMB with the ability to approve or deny any federal agency’s information security system.14 Second, FISMA requires that the National Institute of Standards and Technology (NIST) promulgate the security standards with which agencies must comply.15 NIST fulfills this obligation by producing Federal Information Processing Standards Publication (FIPS) 199, which specifies guidelines for defining the potential impact (low, moderate, or high) of a security breach resulting in the loss of confidentiality, integrity, or availability of government information.16 Third, FISMA requires a certification and accreditation (C&A) process — measuring the efforts to define the appropriate risk on a system-by-system basis — and a plan of action and milestones (POA&M) — measuring compliance with established methodologies for correcting discrepancies.17 Both metrics are designed to measure the people, process, and technology aspects of security, but arguably fail to properly represent true operational security on a system-by-system basis.18 Fourth, and finally, FISMA requires plans and procedures to ensure continuity of operations (COOP) for information systems.19 Such provisions generally include backup systems, transition plans, and security controls to maintain continuity of operations during loss of power, disasters, or other potential interruptions of service.20 FISMA does not require security at all costs; instead, it includes cost-effectiveness among its considerations, requiring agencies to implement policies and procedures that cost-effectively reduce risks to an acceptable level.21
B. The Privacy Act
The second major body of security legislation governing agencies’ actions is the Privacy Act of 1974.22 The act, which was passed on December 31, 1974 and went into effect in September of the subsequent year, serves the dual purpose of preventing the disclosure of individuals’ private information by the government agencies that collect it (e.g., names or other identifying information linked to education, financial transactions, medical history, and criminal or employment history), and enabling individuals to determine what information has been collected, as well as to verify its accuracy.23 Specifically, as outlined in the E-Government Act of 2002, agencies are required to conduct a Privacy Impact Assessment (PIA) prior to “developing or procuring information technology or initiating a new collection of information in an identifiable form.”24 For each new information collection, an agency must publish a PIA which addresses (1) what information is being collected; (2) why the information is being collected; (3) the agency’s intended use of the information; (4) with whom the information will be shared; (5) what notice or opportunities will be provided to individuals regarding what information is collected and with whom that information will be shared; (6) how the information will be secured; and (7) whether the information collection will create a system of records.25 PIAs are intended to be commensurate with the size of the information system being assessed, the sensitivity of the information in identifiable form in that system, and the risk of harm from unauthorized release of that information.26 The first step, then, for an agency wishing to deploy information technology is to determine whether such a system constitutes an “information system” under the act.
The most recent definition of “information system” comes from the 1995 revision of the Paperwork Reduction Act (PRA).27 The PRA expanded the definition of information system from “management information system” to any “discrete set of information resources organized for the collection, processing, maintenance, use, sharing, dissemination, or disposition of information.”28 In this context, information resources include “information and related resources, such as personnel, equipment, funds, and information technology.”29 At the time of its drafting, an information system roughly corresponded with a physical machine (for example, a computer). These physical boundaries made the task of privacy oversight relatively straightforward.30 In the 1960s and 1970s, computing power was centrally owned by government and large corporations. With the emergence of the personal computer in the late 1970s and 1980s, that computing power was decentralized as the technology became affordable to the average consumer. The decentralizing trend continued into the 1990s with the explosion of mobile computing. Today, however, we are seeing a shift back to a more centralized computing model. As hardware becomes commoditized, computing power is once again being aggregated into large data centers and becoming accessible much like a public utility.31 This blurring of physical lines — where multiple processes may live on the same physical machine, and many virtual machines are constantly created and destroyed — has the potential to significantly complicate an agency’s determination as to when an information system exists.
The second threshold determination complicated by recent advances in technology is the creation of a system of records. The Privacy Act requires agencies to report the systems of records that they maintain and, for each system, to describe in detail what information is collected, the purpose of the information collection, and to whom requests for records should be sent.32 Prior to establishing any new system of records, agencies are required to go through a lengthy approval process and to publish a System of Records Notice (SORN) in the Federal Register.33 The term “system of records” is an artificial distinction, legislatively created to identify those groups of records to which the Privacy Act applies and those to which it does not.34 The Act itself defines a system of records as “a group of any records under the control of any agency from which information is retrieved by the name of the individual or by some identifying number, symbol, or other identifying particular assigned to the individual.”35 This analysis involves examining the actual methods a given system uses to store and retrieve information.36
Such an analysis is easy to apply to physical systems of records (for example, records in a doctor’s office organized by patient name or Social Security number) or to the first computerized databases, which required strict, predefined indices to efficiently retrieve a given record. Today, however, most methods of storage and retrieval allow filtering records by any combination of data elements (for example, SQL), and emerging methods of searching large amounts of data (for example, MapReduce, Hadoop) have no index at all. This raises the question of how many retrievals of information using one’s name or other unique identifier, even if not via a traditional index, are sufficient to create a system of records.37 Compounding the problem, industry best practices dictate that even the simplest systems — be they public-facing systems like a website or internal business-process systems that aid agency functions — assign unique identifiers to each record to aid in retrieval. Clearly, the Privacy Act has not kept pace with the reality of how technology is actually used to support an agency’s mission.
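To make the distinction concrete, consider a minimal sketch using SQL via Python’s built-in sqlite3 module. The table, columns, and data below are hypothetical; the point is only that a modern data store permits retrieval by an individual’s name even when no index on that field was ever defined.

```python
import sqlite3

# Hypothetical illustration: a simple comment store with no index on any
# personal identifier. Table and column names are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE comments (
        id   INTEGER PRIMARY KEY,  -- unique identifier assigned to each record
        name TEXT,                 -- commenter's name, stored but not indexed
        body TEXT
    )
""")
conn.execute("INSERT INTO comments (name, body) VALUES (?, ?)",
             ("Jane Doe", "Great initiative!"))

# Even without a predefined index on `name`, any record can be retrieved by
# the individual's name -- the kind of ad hoc retrieval that complicates the
# Privacy Act's "system of records" analysis.
rows = conn.execute(
    "SELECT id, body FROM comments WHERE name = ?", ("Jane Doe",)
).fetchall()
print(rows)
```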
C. Formal Guidance
The third source of agency responsibility comes from formal guidance promulgated by OMB and NIST. The Federal Acquisition Regulation (FAR) requires agencies to comply with OMB’s implementing policies, including Appendix III of OMB Circular A-130, and with guidance and standards from the Department of Commerce’s National Institute of Standards and Technology.38 Circular A-130 requires agencies to incorporate a system security plan within the information resource management (IRM) planning processes required by the Paperwork Reduction Act. Such a plan must include established rules of behavior regarding each system, adequate training, personnel controls, incident response capabilities, continuity of support, technical security, and written agreements for any interconnections between systems. A-130 also outlines additional, more stringent requirements for major systems.
NIST publishes its requirements as part of Federal Information Processing Standards (FIPS) 199 and Special Publication (SP) 800–53.39 FIPS 199 provides a framework for determining the level of risk associated with the potential impact to an organization or to individuals for a given system across three metrics: confidentiality, integrity, and availability. This, in turn, informs additional requirements depending on the security category assigned. SP 800–53 further outlines the various security controls agencies must implement for each system. Such controls include management controls — security assessment and authorization, risk assessment, system and services acquisition, and program management; operational controls — personnel security, physical and environmental security, contingency planning, configuration management, maintenance, system and information integrity, media protection, incident response, and awareness and training; and technical controls — identification and authentication, access control, audit and accountability, and system and communications protection. Various other publications impose additional continuous monitoring requirements beyond the initial certification and accreditation of the application.40
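To illustrate how the FIPS 199 categorization drives the requirements that follow, here is a minimal sketch of the conventional “high water mark” approach (formalized in FIPS 200), under which a system’s overall category is the highest impact level assigned to any of the three security objectives. The example system and its impact assignments are hypothetical.

```python
from enum import IntEnum

class Impact(IntEnum):
    LOW = 1
    MODERATE = 2
    HIGH = 3

def security_category(confidentiality: Impact, integrity: Impact,
                      availability: Impact) -> Impact:
    # The overall category is the highest ("high water mark") of the impact
    # levels assigned to the three security objectives under FIPS 199.
    return max(confidentiality, integrity, availability)

# A hypothetical public data catalog: its contents are already public (low
# confidentiality), defacement would moderately harm the agency (moderate
# integrity), and brief downtime is tolerable (low availability).
print(security_category(Impact.LOW, Impact.MODERATE, Impact.LOW).name)  # MODERATE
```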
II. Mere Security Theater
The myriad guidance and labyrinthine requirements may do less to secure our nation’s information systems than policymakers hope. A number of critics have complained that the current security regime incentivizes agencies to perform a simple box-checking paperwork exercise that does not keep up with an ever-expanding and ever-changing world of threats.41 This paperwork drill puts into place and measures paper-based processes, rather than technical processes, for implementing information security,42 and ultimately fails to address the root cause of network exploitation: inadequate software quality assurance.43 Even assuming 100% compliance with all FISMA requirements, many believe that agencies would fall short of bolstering actual information security.44 Tim Bennett, President of the Cyber Security Industry Alliance, lamented that FISMA grades not how well agencies have increased their information security, but rather how well agencies have increased their compliance with FISMA-mandated processes.45 The current FISMA reports say absolutely nothing about government security; they are merely a measure of compliance with report generation.46 FISMA erroneously assumes that an agency’s compliance with largely reactionary standards is an objective measure of information security. Instead, such an assumption incentivizes agencies to expend effort on reporting rather than on addressing the underlying security threats the reports represent.47 By focusing on security audits rather than the actual security of systems, FISMA provides a framework for federal Chief Information Officers (CIOs) to quantify their progress in ways a large, non-technical bureaucracy can easily digest. Rather than drill down on the number or nature of attacks deflected, CIOs can simply report that they have achieved a specified level of FISMA compliance, and thus justify their various budgets.48 Put another way, FISMA provides little incentive to address any security metric outside those explicitly required of agencies by FISMA.49
This focus on reporting requirements and audits, rather than actual information security, may explain why cyber attacks increased some 250% between 2007 and 2009.50 The Computer Emergency Response Team (CERT) estimates that 95% of intrusions exploited known software vulnerabilities for which countermeasures were readily available.51 Given the fluid nature of the technology industry and the reliance on cash flow to support operations, software manufacturers face increased pressure to rush their products to market to better capitalize on the product’s innovation.52 This creates a race to the bottom in terms of software quality. Given strict budgets and timelines, agencies are forced to either cut desired functionality or skirt security requirements.
Such a focus on reporting and managerial controls rather than on actual information security manifests itself on a near-daily basis. Despite our nation’s best efforts at complying with codified security policy, thousands of computers on our military networks were infected by malware,53 unauthorized users in Iran were able to gain access to blueprints and other information about a helicopter in the President’s fleet,54 and the State and Defense departments have lost more than six terabytes of information to digital espionage, an amount equal to one-sixth of the information contained within the Library of Congress.55 As recently as last year, a single Army private was able to steal more than a quarter of a million classified State Department cables and nearly 100,000 intelligence reports.56 This deficiency, however, is not limited to threats to national security. It is also personal. The Department of Defense was hit with a $4.9 billion class action suit as a result of the theft of the personal information of some 4.9 million uniformed service members and their families,57 the Department of Veterans Affairs lost health records and other sensitive personal information for approximately 26.5 million veterans and their spouses,58 the Secretary of Defense’s unclassified email was hacked,59 and the Navy CIO had his personal information compromised not once or twice, but on six distinct occasions.60
III. Problems with Existing Information Security Policy
Existing security policies fail to properly protect federal information systems. Empirically, agencies are unable to implement the security controls FISMA defines. This may be due, at least in part, to the ambiguity that shackles an anachronistic policy when it is confronted with modern advances in technology. Even when requirements are clear, in the face of competing priorities such obligations are often met with indifference within the agency, and unless a major attack brings cybersecurity to the forefront, such ambivalence is likely to persist among the public as well. Finally, Congress’s overall lack of attention makes accountability a challenge, and compliance suffers further as a result. Taken together, these factors mean the existing security regime may fall short of its promises.
A. Difficulty Implementing
Empirically, agencies struggle to meet the requirements existing security policy asks of them. To date, none of the 24 major agencies61 have fully implemented the agency-wide information security programs required by FISMA.62 Seven of these agencies described their security as poor, nine as satisfactory, and only six as good.63 It should come as no surprise, then, that the House Government Reform Committee rated government-wide FISMA compliance a D+.64 In 2004, two years after the act was penned, 23% of federal IT systems lacked the required risk assessment, and one in four lacked the contingency plans necessary to ensure continuity of operations. Of those with plans, a mere 57% had ever been tested. Nine of the 24 agencies did not even have a complete inventory of their IT systems.65 Today, nearly a decade after FISMA was enacted, the numbers remain similarly bleak. As the federal IT footprint continues to swell and budgets continue to tighten, in FY 2011 only 33% of agencies reported compliance with FISMA’s risk management requirements, and the same one-in-three ratio had programs compliant with the contingency planning requirements. Only one in four were compliant with configuration management, POA&M, and identity and access management requirements. Thirteen percent had not even begun to implement continuous monitoring programs, and one in five still lacked an accurate IT inventory.66
One example of agencies’ difficulty complying can be seen in the Department of Veterans Affairs (VA). In 2007, a hard drive containing nearly 200,000 veterans’ records went missing from a safe. The Office of the Inspector General concluded that the VA’s security plan did not comply with its own rules for securing data, and that the department improperly allowed an IT specialist entrusted with the data access to information beyond his clearance.67 After public disclosure, a lawsuit arose, and when the opinion was issued nearly two years later, the court noted that it had “no reason to think that all of the alleged violations have been remedied.” Robert T. Howard, Assistant Secretary for Information and Technology, stated before a Senate inquiry that the hard drive theft was a “wake up call” and that “[a]s a result of that incident we began to create the environment needed to better protect the sensitive information entrusted to us.”68 Despite the lesson learned, four years after the incident the VA reported that only 55% of its portable devices had FISMA-mandated encryption.69 A related case involving the Bureau of Indian Affairs (BIA) resulted in an injunction requiring that various IT systems be disconnected from the internet due to inadequate information security. The court noted that the agency’s FISMA compliance had lagged behind the expansion of the department’s internet presence.70 Although anecdotal, both incidents illustrate one thing: on the whole, federal agencies are unable to adequately implement the requirements FISMA imposes on them.
B. Ambiguity
Even when agencies strive to comply with their information security obligations, FISMA’s requirements are often drafted so ambiguously that knowing what those obligations actually entail can be impossible.71 In certain situations, for example, agencies are required to ensure the security of federal data maintained on third-party systems. Yet the scope of this obligation, which attaches when such systems are operated “on behalf of an agency,”72 is often ambiguous, and the act’s legislative history sheds no additional light on the phrase. The congressional report accompanying FISMA is written largely in broad terms, with platitudes about the importance of information security; the legislative history in fact shows virtually no congressional focus on the nitty-gritty of how the statute is to be implemented.73 Additionally, FISMA provides no mechanism for agencies to clarify such ambiguities. This can result in months (or longer) of inaction as bureaucrats and lawyers struggle to interpret a technical statutory scheme with which they may have little familiarity.74 Perhaps among the most telling signs of FISMA’s troubling ambiguity is the birth of a Beltway industry specializing in FISMA compliance. As one security-consulting firm asserted, “Information Security and Privacy regulations are purposely vague to ensure they cover a wide range of organizations over a long period of time without having to be amended by Congress.”75 FISMA as currently drafted does not provide adequate guidance for agencies to properly implement its requirements.
C. A Culture of Indifference
Compounding the difficulties surrounding implementation is a cloud of indifference. Most agency employees likely view FISMA’s requirements as unsexy. Workers are more motivated, more productive, and thus more effective when they feel the task at hand is worthwhile.76 Yet an agency’s information security activities have no direct connection to its substantive policy goals. It is not a stretch to think agency employees may lack the sense that operationalizing FISMA’s requirements is important or particularly urgent. Where bureaucrats are unmotivated, inertia can easily overpower the impetus to make costly changes, as agency employees seek to minimize the amount of FISMA work they must do.77 Put another way, without clear alignment with policy goals, implementation of FISMA may be destined to remain “good enough for government work.”
The court in the BIA case noted earlier78 described a bureaucratic culture marked by indifference, confusion, and lack of accountability. In one passage, the court discussed a “stunning lack of management and oversight” in the context of the departmental IT security program. While technicians were aware of the security vulnerabilities, they made no effort to fix them. The court continued, “Interior’s IT security planners have discussed [the necessary fix] only in concept”; one DOI senior official testified that he was “not aware that we’ve actually put it down into a formalized, written plan.”79 Such indifference not only threatens FISMA implementation, but can also render our nation’s information systems less secure.
D. A Disinterested Public
This indifference towards FISMA, however, does not remain solely within agency walls. An enormous security breach — if, for example, individuals’ tax filings were compromised — would likely trigger widespread public outrage and bring the issue to the forefront of taxpayers’ minds. Absent such a breach, however, public interest is likely to remain low.80 This lack of public interest can breed a culture of complacency within Congress as well. It is difficult to imagine that members of Congress receive many constituent inquiries into the pace of FISMA adoption, or that they believe their ability to seek reelection is tied closely to the day-to-day success of agency CIOs. While citizens may cast ballots in favor of any issue they choose, FISMA is unlikely to emerge from the periphery and receive enough public attention to strongly influence congressional elections; one can thus assume that oversight of information security is low on both Congress’s and the public’s list of priorities.81
E. Lack of Accountability
Between employee indifference, public apathy, and general congressional inattentiveness, agencies’ compliance with FISMA-mandated requirements (or lack thereof) receives far less scrutiny than work considered core to the agency’s mission. As such, deficiencies may go underappreciated or wholly unnoticed.82 Even when the OMB reporting process uncovers such failures, it is unlikely that consequences will result: “[T]he nature of the stick is so draconian and counterproductive to agency effectiveness that it is hard to imagine OMB ever fully imposing it.” Federal IT has led to vast improvements in the way government agencies deliver citizen services and perform day-to-day operations, and punishing an agency by retarding its ability to take advantage of such advancements seems unlikely. Further, given the sheer number of delinquent agencies, bureaucrats may be able to take solace in a safety-in-numbers mentality. While OMB might conceivably withhold funding from one or two deviant bureaus, mass sanctions would be both politically inconceivable and functionally disastrous, effectively neutralizing any threat implicit in OMB’s oversight role.83
IV. The Toll on Taxpayers
Such a heavyweight information security policy affects taxpayers in two ways. First, it imparts a direct administrative cost, as agencies divert man-hours from supporting the agency’s mission to shouldering a significant administrative burden. Second, it imparts an indirect cost, as agencies struggle to operationalize new technologies and approaches to delivering citizen services and transacting the business of the nation.
A. Administrative Costs
As noted earlier, FISMA and its associated implementation guidance impose a significant administrative burden on agencies. For example, assume for a moment that a federal agency would like to establish a simple blog for a newly announced short-term initiative. The blog would allow agency staff members to post updates on the initiative on a regular basis and would provide members of the public with the opportunity to post comments — a fairly standard practice among similar sites. In the private sector, even a non-technical content creator or subject-matter expert could simply navigate to one of the many companies that provide blog hosting as a service, enter a purchase card number, and stand up an entire site in a matter of minutes. In the public sector, however, such is not the case.
The program office initiating the request would first have to contact the chief security officer to begin the FIPS 199-mandated process of determining the FISMA risk level, which in this hypothetical, after some work, we can assume will be low given the non-essential nature of the site. Next, the software and the facilities that host it would be required to undergo a C&A process, even if another agency has already done the same. This process may require a significant investment of time if, for example, the hosting company has not previously undergone an invasive third-party security audit, and may even require the agency to host the software within its own datacenter (thus adding to the overall cost) if the hosting company, presumably a small technology startup, is unwilling to submit to one. Once a primary datacenter and application are certified, COOP requirements may also dictate that a fallback, parallel site be concurrently established on a separate system (potentially doubling the cost imposed).
Once the technical requirements are met, significant legal requirements must be met as well. If, as in this hypothetical, the agency wishes to allow citizens who provide their name the opportunity to respond to each post (much like Facebook or most blogs on the internet today), the agency will most likely be required to conduct a PIA, including the 30- and 60-day comment periods, receive OMB approval for the information collection under the PRA, and publish a SORN if comments are to be retrieved by the commenter’s name, as is generally the case with blogs.
Finally, the agency must establish a formal security plan and administrative controls. The security plan would outline rules of behavior, training, personnel controls, incident response, continuity of support, technical security, and, if externally hosted as in the above example, an interconnection agreement with the third-party hosting company. Last, the agency would have to design and fully implement SP 800–53 controls, including management controls — security assessment and authorization, risk assessment, system and services acquisition, and program management; operational controls — personnel security, physical and environmental security, contingency planning, configuration management, maintenance, system and information integrity, media protection, incident response, and awareness and training; and technical controls — identification and authentication, access control, audit and accountability, and system and communications protection.
Fully complying with both the letter and the spirit of current security requirements imposes a significant administrative cost on agencies. While such a heavy-handed procedure is proportionate for our nation’s mission-critical systems, a tension arises when that same procedure is applied to today’s widely consumerized technologies.
B. Opportunity Costs
Beyond the direct administrative costs imposed by the government’s existing security policy, there is also a hidden opportunity cost. Imagine, as in the above example, an agency looking to engage the public around a short, three-month policy initiative. Because the administrative overhead required by existing security policy may impose a three- to six-month lead time before the agency could publish its first word, it is likely to forego the undertaking entirely. Such lost opportunities, however, are not limited to blogs. As the industry increasingly moves to a hosted service provider model (infrastructure as a service, platform as a service, and software as a service), such conflicts are going to become increasingly common, be they innovative new social networks, such as the code-sharing service GitHub, or cloud-based business tools, such as the collaboration suite Basecamp or the customer relationship manager Salesforce.
Part of the private sector’s success and its recent explosion of web-based startups can be attributed to cloud computing’s ease of deployment and an emerging service-oriented architecture. A single web developer can programmatically create and destroy servers on demand to meet scalability requirements, or simply to rapidly prototype a new application. At the same time, service providers are increasingly exposing their underlying data as application programming interfaces (APIs), allowing developers to loosely connect systems and rapidly bring products to market. Government developers, however, are precluded from fully taking advantage of these advances without incurring extensive administrative costs, and thus the means by which citizen services are transacted are often generations behind their private sector counterparts.
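As a rough illustration of what creating and destroying servers programmatically looks like in practice, the sketch below uses AWS’s boto3 SDK for Python, one of many provider APIs a developer might choose; the machine image ID, instance type, and region are placeholders, not recommendations.

```python
import boto3  # AWS SDK for Python, one example of a cloud provider's API

# Illustrative sketch only: a developer provisions a server on demand,
# prototypes against it, and tears it down. The AMI ID is a placeholder.
ec2 = boto3.resource("ec2", region_name="us-east-1")

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance = instances[0]
instance.wait_until_running()
print(f"Prototype server {instance.id} is running")

# When the experiment is over, the server is destroyed just as easily.
instance.terminate()
instance.wait_until_terminated()
```

In the private sector, this loop of provisioning, prototyping, and tearing down can happen many times a day; under the current federal regime, each new system risks triggering the full certification apparatus described above.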
V. Streamlining Our Nation’s Information Security
The nation’s security policy is not inherently flawed, and for many mission-critical IT systems, taxpayers should demand no less. For many smaller, citizen-facing systems, however, a streamlined authorization process could more rapidly deliver smarter, better tools with which government agencies can transact the nation’s business, and do so in a secure and efficient manner.
A. Carrots, not Sticks
OMB should seek to enforce FISMA compliance through incentives rather than punishments. Agencies with exemplary security records should qualify for additional funding, resources, and personnel to induce others to follow their lead.84 Bureaucrats, by their nature, are hungry for additional funding. As is well established in the private sector, the prospect of greater resources at an agency CIO’s disposal is a far greater motivator than simply seeking to avoid reprimand from OMB.85 Such incentives would not only encourage compliance with existing requirements, but would also encourage CIOs to think critically about their security posture in hopes of a resource award, thus ameliorating the threat of mere box checking.
B. Reduce Duplication of Efforts
Certifications and other collateral should be fully transferable between agencies. This would have two implications. First, when multiple agencies utilize a single information resource, OMB could determine which agency bears the responsibility for ensuring FISMA compliance, saving the other agencies (and taxpayers) significant time and effort.86 Second, an application certified by one agency should be fully transferable to another. In the blog example used earlier, if one agency had gone through the certification process for the service provider, another agency would be able to piggyback on its efforts. Such an arrangement may require OMB to curate a collaborative commons of shared certifications and records made available to agencies.
C. Modular Administration
OMB should create guidance to decouple the various components that make up an information system from the administrative assets that must accompany them. An agency site, for example, might consist of the cloud service provider, the generic software image on which the server is based, and the content management system that generates the output visitors see (collectively often referred to as the technology stack). Because many of these components are reused between information systems (for example, another site may share the same server image, or a database or other application may share the same datacenter), if agencies could analyze each component independently, in a modular fashion, they could realize significant savings in the long run, especially if such certifications could be shared among agencies, as sketched below.
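The sketch below is a purely hypothetical illustration of that modular approach: each stack component carries its own certification record, so a new system needs fresh assessments only for the components no agency has yet certified. The component names and agency labels are invented.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Component:
    name: str                            # e.g., datacenter, server image, CMS
    certified_by: Optional[str] = None   # agency that already certified it, if any

def remaining_assessments(stack: List[Component]) -> List[str]:
    """Return the components that still need a fresh security assessment."""
    return [c.name for c in stack if c.certified_by is None]

# Hypothetical technology stack for the blog example above.
blog_stack = [
    Component("cloud service provider", certified_by="Agency A"),
    Component("generic server image", certified_by="Agency A"),
    Component("content management system"),  # not yet certified by any agency
]

print(remaining_assessments(blog_stack))  # ['content management system']
```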
D. A Grace Period for Pilot Programs
Last, and most importantly, Congress should carve out a pilot-program exception within existing IT security requirements. Given the pace at which consumer technology advances, if agencies wish to adopt new, innovative services to improve non-mission-critical lines of business, they should be permitted to do so with limited administrative overhead. Given such a grace period, and with rigid adherence to industry best practices, agencies could rapidly prototype and onboard new applications and services to improve the delivery of citizen services and, if successful, ensure FISMA compliance in tandem with the ongoing pilot.
Conclusion
Congress has imposed on federal executive agencies an onerous system to ensure information security benchmarks are met. Despite OMB’s best efforts, however, compliance is low and successful attacks continue. Agencies are finding it difficult to complete the myriad requirements, which become increasingly ambiguous in light of an ever-changing technology landscape. This may be due, at least in part, to an indifference endemic to both agency implementers and the public. Such requirements, however, do not come without a cost. FISMA creates significant administrative overhead for agencies looking to innovate, and in many cases may do so to such an extent as to retard or otherwise prevent adoption. As a result, some argue that by inhibiting such innovation, today’s federal security policies secure nothing more than the status quo. As communications technology becomes increasingly consumerized, the opportunity for federal agencies to do more with less, and to streamline the delivery of existing citizen services or expand into new ones, has never been more apparent. Yet at the same time, public sector adoption is falling ever further behind its private sector counterparts. Instead, OMB should seek to incentivize those agencies that best secure federal information assets and seek out innovative, secure solutions for transacting the nation’s business. Existing requirements can be streamlined, for example by breaking security analyses into the disparate technology components they represent and allowing such modules of certification to be shared among systems and agencies. Finally, a formal grace period for low-risk, high-impact, citizen-facing systems could usher in a new era of transparency and collaborative democracy yet unimagined. Our nation’s information is one of its chief resources, and great care should be taken to secure it, just as we secure our territories and tangible interests. Such security, however, and the overhead required to implement it, should be proportionate to the risk involved, and should secure the information systems of tomorrow, not simply the status quo.
-
Ctr. for Strategic & Int’l Studies, Securing Cyberspace for the 44th Presidency 56 (2008), available at https://csis.org/files/media/csis/pubs/081208_securingcyberspace_44.PDF ↩
-
Chief Financial Officers Act of 1990, Pub. L. 101–576, Nov. 15, 1990, 104 Stat. 2838. ↩
-
Office of Management and Budget Fiscal Year 2011 Report to Congress on the Implementation of the Federal Information Security Act of 2002, March 7, 2012, available at http://1.usa.gov/yiyFBb. ↩
-
White House Office of the Press Secretary, Remarks by the President on Securing Our Nation’s Cyber Infrastructure (May 29, 2009), http://1.usa.gov/gvW7VM. ↩
-
Cybersecurity: Next Steps To Protect Our Critical Infrastructure: Hearing Before the S. Comm. on Commerce, Science & Transportation, 111th Cong. (Feb. 23, 2010) (statement of Sen. Rockefeller) (A major cyberattack could shut down our nation’s most critical infrastructure….), http://1.usa.gov/TLlgEh; Senate Comm. on Commerce, Science & Transportation, Press Release, Rockefeller and Snowe Gain Momentum for Landmark Cybersecurity Act (Mar. 24, 2010) (statement of Sen. Snowe) (cyber intrusions and attacks represent both a potential national security and economic catastrophe), https://www.commerce.senate.gov/public/index.cfm/pressreleases?ID=3A0945BB-D5D8–47F4-A86C-2F71F15892BD. ↩
-
Marshall, Panetta Discusses Security Challenges in Stratcom Visit, American Forces Press Service, Aug. 5, 2011, http://archive.defense.gov/news/newsarticle.aspx?id=64946. ↩
-
Rethinking FISMA and Federal Information Security Policy, 81 N.Y.U. L. Rev. 1844, 1846 (2006), citing Catriona Davies, US Army Computers Shut Down by Hacker, Daily Telegraph (London), July 28, 2005, at 11 (internal quotation marks omitted). ↩
-
Ukman, Jason, 24,000 Pentagon files stolen in major cyber breach, officials say, Washington Post, July 14, 2011, available at http://wapo.st/o5wKnu. ↩
-
Keneally, Meghan, Chinese government hacks into White House office in charge of the nuclear launch codes, Daily Mail, October 1st, 2012, available at http://bit.ly/SSZJgH. ↩
-
44 U.S.C.A. 3541(1). ↩
-
Robert Silvers, Rethinking FISMA and Federal Information Security Policy, 81 N.Y.U. L. Rev. 1844, 1847–48 (2006), citing H.R. Rep. No. 107–787, pt.1, at 54 (2002), as reprinted in 2002 U.S.C.C.A.N. 1880, 1889 (noting that FISMA consolidates the Government Information Security Reform Act, Pub. L. No. 106–398, sec. 1061–65, 3531–36, 114 Stat. 1654A, 266–75 (2000), the Information Technology Management Reform (Clinger-Cohen) Act of 1996, Pub. L. No. 104–106, 5001–02, 110 Stat. 679, 679–80, the Computer Security Act, Pub. L. No. 100–235, 101 Stat. 1724 (1988), and the Paperwork Reduction Act of 1980, Pub. L. No. 96–511, 94 Stat. 2812). ↩
-
44 U.S.C.A 3542(b)(1); see also FAR 2.101(b); 70 Fed. Reg. 57449, 57451 (Sept. 30, 2005). ↩
-
44 U.S.C. 3543(a). ↩
-
ID. 3543(a)(5). ↩
-
44 U.S.C.A. 3543(a). ↩
-
Federal Information Processing Standards Publication 199, Standards for Security Categorization of Federal Information Systems 1, 2 (Feb. 2004), available at http://csrc.nist.gov/publications/fips/fips199/FIPS-PUB-199-final.PDF. FAR 11.102 and 11.201 include references to the FIPS PUB standards. ↩
-
Daniel M. White, The Federal Information Security Management Act of 2002: A Potemkin Village, 79 Fordham L. Rev. 369, 380–81 (2010) ↩
-
Id., citing Arthur Conklin, Why FISMA Falls Short: The Need for Security Metrics, 41 Wireless internet S. Provider Proc. 1, 1–8 (2008); see also Agencies in Peril (internal citations omitted). ↩
-
44 U.S.C.A. 3544(b)(8). ↩
-
See generally 12–3 Briefing Papers 1, 11–12. ↩
-
44 U.S.C.A. 3544(a)(2)(C); see also 44 U.S.C.A. 3544(b)(2)(B). NIST Special Publication 800–53A, Rev. 1, Guide for Assessing the Security Controls in Federal Information Systems and Organizations 3, 1.1 (June 2010) (Organizations have the inherent flexibility to determine the level of effort needed for a particular assessment… This determination is made on the basis of what will accomplish the assessment objectives in the most cost-effective manner and with sufficient confidence to support the subsequent determination of the resulting mission or business risk.), available at http://1.usa.gov/aDXfog. ↩
-
5 U.S.C.A 552a. ↩
-
ID. 552a(b), (d), (f). ↩
-
44 U.S.C.A. 3601 et seq. ↩
-
ID. See generally, Shahid Khan, "Apps.gov": Assessing Privacy in the Cloud Computing Era, 11 N.C.J.L. & Tech. On. 259, 272 (2010). ↩
-
44 U.S.C.A. 3601 at 208(b)(2)(b)(i), 116 Stat. 2922. ↩
-
44 U.S.C.A. 3501. ↩
-
44 U.S.C.A. 3502(14) (1994), amended by Paperwork Reduction Act of 1995, Pub. L. No. 104–13, 3502(8), 109 Stat. 163, 166. ↩
-
ID. at 3502(6). ↩
-
See Shahid Khan, "Apps.gov": Assessing Privacy in the Cloud Computing Era, 11 N.C.J.L. & Tech. On. 259, 273 (2010) ↩
-
Shahid Khan, "Apps.gov": Assessing Privacy in the Cloud Computing Era, 11 N.C.J.L. & Tech. On. 259, 273–74 (2010) ↩
-
5 U.S.C.A. 552a(e) (2000). ↩
-
About Privacy Act Issuances, Government Printing Office, available at http://1.usa.gov/QSzFBE. ↩
-
Julianne M. Sullivan, Will the Privacy Act of 1974 Still Hold Up in 2004? How Advancing Technology Has Created A Need for Change in the “System of Records” Analysis, 39 Cal. W. L. Rev. 395, 399 (2003) (adding The definition provided by the Privacy Act is not based on the ordinary, plain meaning of the words system of records, but is in fact a very specific type of system, with very particular rules. This distinction likely arose out of the need to create some kind of distinction between groups of records that should be accessible and those that should not.) ↩
-
5 U.S.C.A. 552(a)(5) (2000). ↩
-
Henke v. Dept of Commerce, 83 F.3D 1553, 1459–60 (D.C. Cir. 1996). ↩
-
See Sullivan, citing Henke, 83 F.3D at 1461 (noting that when records are compiled for investigatory purposes, even a few retrievals might be sufficient to create a system of records). ↩
-
FAR 7.103(w). See generally Appendix III to OMB Circular No. A-130, Office of Management and Budget, available at http://1.usa.gov/RlibPS. ↩
-
FIPS PUB 199, Standards for Security Categorization of Federal Information and Information Systems 1, 2 (Feb. 2004), http://csrc.nist.gov/publications/fips/fips199/FIPS-PUB-199-final.PDF; NIST Special Publication 800–53, Rev. 3, Recommended Security Controls for Federal Information Systems and Organizations 6, 2.1 (May 2010), http://1.usa.gov/9DDoih. ↩
-
See, for example, OMB Memorandum 11–33, FY 2011 Reporting Instructions for the Federal Information Security Management Act and Agency Privacy Management (Sept. 14, 2011) (enclosing DHS Memorandum FISM 11–02 (Aug. 24, 2011)), http://1.usa.gov/oCB4it. ↩
-
More Security, Less Waste: What Makes Sense for Our Federal Cyber Defense: Hearing Before the Subcomm. on Federal Financial Management, Government Information, Federal Services & International Security, 111th Cong. (Oct. 29, 2009) (statement of Sen. McCain), http://1.usa.gov/Ut4XuK. ↩
-
GovWin Says FISMA Fails to Improve Overall Security, March 16, 2006, http://bit.ly/UxOqHX. ↩
-
Daniel M. White, The Federal Information Security Management Act of 2002: A Potemkin Village, 79 Fordham L. Rev. 369, 372 (2010) (internal citations omitted). ↩
-
See Wm. Arthur Conklin, Why FISMA Falls Short: The Need for Security Metrics, 41 Wireless internet S. Provider Proc. 1, 1–8 (2008); see also Agencies in Peril: Are We Doing Enough to Protect Federal IT and Secure Sensitive Information?: Hearing Before the S. Subcomm. on Fed. Fin. Mgmt., Gov’t, Info., Fed. Servs., and Int’l Sec., 110th Cong. 1 (2008) at 2–6 (statement of Tim Bennett, President of Cyber Security Industry Alliance) (identifying general flaws in FISMA reporting); Angela Gunn, Fed Having Fits over FISMA and Cybersecurity, Betanews (Dec. 12, 2008), http://bit.ly/X16RFd. See generally Daniel M. White, The Federal Information Security Management Act of 2002: A Potemkin Village, 79 Fordham L. Rev. 369, 380 (2010). ↩
-
Id. ↩
-
Vijayan, Jaikumar, Critics question value of federal IT security report card, Computerworld, May 21st, 2008, available at http://www.networkworld.com/article/2279783/lan-wan/critics-question-value-of-federal-it-security-report-card.HTML (noting some agencies that are making an effort to comply with the true intent of the 396-page FISMA requirements document are getting poor grades on the annual report card, while others that have treated the process as a mere paperwork exercise are getting good grades.) (Internal quotation marks omitted). ↩
-
White, 79 Fordham L. Rev 369, 382; ID. (commenting First, Congress creates waste by writing FISMA in a way that demands useless reporting, and then it highlights the useless scores in a way that in some cases provides incentives for federal agencies to deliver misleading results.). ↩
-
White, 79 Fordham L. Rev 369, 381–82. ↩
-
ID. at 381. ↩
-
Gregg Carlstrom, Net Attacks Triple in 2 Years, Fed. Times (Aug. 3, 2009). (a conservative estimate, since many agencies underreport by as much as 50%, and since the statistic excludes the Department of Defense, which receives millions of scans and probes each year.); See also Conklin, 41 Wireless internet S. Provider Proc. 1 (The recent spate of highly publicized information security failures in Federal agencies highlight the limitations of the current FISMA based approach…. The fact that … some agencies have not had an information security failure[s may be due to] lack of knowledge.). See generally ID. at 382. ↩
-
Kevin R. Pinkney, Putting Blame Where Blame Is Due: Software Manufacturer and Customer Liability for Security-Related Software Failure, 13 Alb. L.J. Sci. & Tech. 43, 66 (2002). ↩
-
White, 79 Fordham L. Rev 369, 384. ↩
-
White House Office of the Press Secretary, Remarks by the President on Securing Our Nation’s Cyber Infrastructure (May 29, 2009), http://1.usa.gov/gvW7VM (In one of the most serious cyber incidents to date against our military networks, several thousand computers were infected [in 2008] by malicious software–malware.). ↩
-
Source in Iran Sees Plans for President’s Chopper, USA Today, Mar. 2, 2009 (The U.S. Navy is investigating how an unauthorized user in Iran gained online access to blueprints and other information about a helicopter in President Obama’s fleet.) ↩
-
Cybersecurity: Assessing the Nation’s Ability To Address the Growing Cyber Threat: Hearing Before the H. Comm. on Oversight & Government Reform, 112th Cong. (July 7, 2011) (statement of Rep. Issa), http://1.usa.gov/T2MnvZ. ↩
-
Information Sharing in the Era of WikiLeaks: Balancing Security and Collaboration: Hearing Before the S. Comm. on Homeland Security & Government Affairs, 112th Cong. (Mar. 10, 2011) (statement of Sen. Collins), http://1.usa.gov/TMnAuL. ↩
-
Kime, DOD Hit With Lawsuit Over Lost Tricare Data, ArmyTimes, Oct. 13, 2011. ↩
-
S. Rep. No. 111–110, at 3 (Dec. 17, 2009). ↩
-
Cybersecurity: Assessing Our Vulnerabilities and Developing an Effective Response: Hearing Before the S. Comm. on Commerce, Science & Transportation, 111th Cong. 8 (Mar. 19, 2009) (statement of Dr. James Lewis), http://1.usa.gov/QT2gXm. ↩
-
Chabrow, Navy CIO’s PII Exposed for Sixth Time, Gov’t Info. Sec. News, Jan. 4, 2010, http://bit.ly/QT2nlU. ↩
-
As outlined in the Chief Financial Officer Act. ↩
-
No Computer System Left Behind: A Review of the 2005 Federal Computer Security Scorecards Before the H. Comm. on Government Reform, 109th Cong. 32 (2006) (statement of Gregory C. Wilshusen, Director, Information Security Issues, United States Government Accountability Office). ↩
-
Robert Silvers, Rethinking FISMA and Federal Information Security Policy, 81 N.Y.U. L. Rev. 1844, 1850 (2006). ↩
-
Id., citing House Comm. on Gov’t Reform, 109th Cong., Computer Security Report Card 1 (2006). ↩
-
OMB 2004 FISMA Report. See Generally Silvers, 81 N.Y.U. L. Rev. at 1850. ↩
-
OMB 2011 FISMA Report, Table A, available at http://1.usa.gov/yiyFBb. ↩
-
Fanin v. U.S. Dep’t of Veterans Affairs, 572 F.3D 868, 870–71 (11th Cir. 2009). ↩
-
Agencies in Peril: Are We Doing Enough to Protect Federal IT and Secure Sensitive Information?: Hearing Before the S. Subcomm. on Fed. Fin. Mgmt., Gov’t, Info., Fed. Servs., and Int’l Sec., 110th Cong. 1 (2008). ↩
-
OMB 2011 FISMA Report, Figure 8. ↩
-
Cobell v. Norton, 394 F. Supp. 2d 164 (D.D.C. 2005). See generally Silvers, 81 N.Y.U. L. Rev. 1844, 1849–63; White, 79 Fordham L. Rev. 369, 378. ↩
-
See Silvers, 81 N.Y.U. L. Rev. 1844, 1853. ↩
-
In addition, contractors and third-party service providers are implicated under 44 U.S.C.A. 3544(a)-(b) (including information systems provided or managed by contractor, or other source), and DHS Memorandum FISM 11–02 as enclosed in OMB Memorandum M-11–33, Reporting Instructions for the Federal Information Security Management Act and Agency Privacy Management (DHS identified… contractors most likely to be subject to FISMA requirements [including] Service providers–e.g… managed services, like subscriptions to software services.). ↩
-
ID., citing H.R. Rep. No. 107–787, pt.1, at 76–88 (2002). ↩
-
ID. at 1853. ↩
-
NetIQ, NetIQ FISMA Compliance & Risk Management Solutions 2 (2005), available at http://www.fedtek.com/wp-content/uploads/2010/05/fisma_broch.PDF. See generally White, 79 Fordham L. Rev. 369, 405. ↩
-
Silvers, 81 N.Y.U. L. Rev. 1844, 1859–60 (citing L.L. Cummings & Donald P. Schwab, Performance in Organizations: Determinants and Appraisals 90–101 (1973)). ↩
-
Id. ↩
-
Supra footnote 69 et seq. ↩
-
Silvers, 81 N.Y.U. L. Rev. 1844, 1852 (2006) (citing 394 F. Supp. 2d at 261 (quoting trial testimony of W. Hord Tipton, Chief Information Officer, DOI)). ↩
-
ID. at 1860–61. ↩
-
ID. See Jack M. Beermann, Essay, Administrative Failure and Local Democracy: The Politics of DeShaney, 1990 Duke L.J. 1078, 1105 (Administrative failures may be so low on the political agenda that they will not even be addressed in the electoral process.). ↩
-
ID. at 1862, See Beermann, supra note 79, at 1106 (Unelected agents are shielded from direct political scrutiny. Thus, given the difficulty of effective oversight, agency actions may not be brought into line with legislatively stated goals.). ↩
-
Silvers, 81 N.Y.U. L. Rev. at 1862. ↩
-
See generally Silvers, 81 N.Y.U. L. Rev. 1844 at 1868. ↩
-
ID. (citing Randal O’Toole, Reforming the Forest Service 104 (1988) (For top managers, larger budgets mean greater prestige. For middle managers, larger budgets mean more people on their staff, and this generally provides them with higher salaries. For lower managers, larger budgets mean greater opportunities for advancement.)). ↩
-
ID. at 1871. ↩