When it comes to saying “no” to innovation in government, there’s the law, and then there’s the myth that surrounds it. More often than not, that myth serves not to insulate the agency from genuine threats to its infrastructure or mission, but to place a never-ending litany of bureaucratic obstacles in front of those who champion change. Much of that myth — long-ago calcified into organizational policy — stems from government policymakers’ decade-old superstitions about certain technologies:
2005 called, they want their worldview back
Today, there’s a mainframe-sized chasm between the basic assumptions that the private sector makes about software development and the ones that government makes. Ten years ago, building an app in Ruby on Rails would have been a risky proposition given the framework’s relative immaturity at the time, while something like ColdFusion would have been a safe, enterprise-grade bet. Today, the exact opposite is the case, at least for organizations outside the Beltway.
You’d be hard-pressed to find a private-sector firm today worth its venture-capital investment that isn’t built on open source software, or that doesn’t at least have a keen understanding of the unmatched value created when its development teams work more openly. That’s far from the case in government. Even if you’re able to strip away those anachronistic technical assumptions, there are still vast educational, organizational, and cultural hurdles for an agency to overcome before it can even think about its first commit.
Management by FUD
In government, there’s the law, and then there’s the much larger myth that surrounds it — and more often than not, it’s the myth that gets in the way of progress (although the law’s usually blamed). Just because something has been done a certain way doesn’t mean that it must be done that way, or that that’s the only way to do it. Greg Godbout from 18F put it best when he said “there’s a tendency to think that your habit is a policy or a law.”
It’s not malicious, but, given the demographics of government IT, new ideas like open source can feel both personally and professionally threatening. To take an example from just a few years back, the government employee who spent a decade securing funding and headcount to grow the fiefdom that is “his” or “her” datacenter may not immediately see what role they can play in a world where their agency is exclusively in the cloud. In this model, it doesn’t matter what the technology is. The indictment is always the same.
At each agency, the most valuable tool to combat that FUD — fear, uncertainty, and doubt — is not to ask “may I?”, but rather to get directly to the ground truth of what exactly needs to be done to get the relevant stakeholders to “yes” (and who exactly those stakeholders are). Using X service is going to violate agency policy? Do you have a copy of that policy I can read? Can you give me a checklist of everything we need to do to be in compliance with that policy? When was it last updated? Who can we talk to about getting that policy changed?
Incentivizing the risk-averse
Government is often accused of harboring a CYA culture. Trying what’s perceived as new and risky (whether it is or not) may land you in hot water, but absent the profit motive regularly found in the private sector, there’s little incentive to offset that risk or do anything other than what’s already been done. Put differently, a bureaucracy’s primary goal is to ensure its own survival, and through that lens, individual changes are categorically perceived as a threat. As a result, you see two things:
Bureaucrats within the agency’s immune system have an incentive to create additional hoops for your shiny new app to jump through. After all, from their perspective there’s no harm (there’s rarely a cost/benefit or return-on-investment analysis to rely on), and if a hoop isn’t jumped through, it can come back to haunt them if things go south.
There’s a strong pack mentality. If you can point to someone else that’s done it — someone you want to be like — that’s often your best argument. The White House and the CIA use X? I guess it’s good enough for us then. High-visibility, quick wins redefine what it means to be the status quo, and by definition, what it means to be risk-averse.
The next time you advocate for change within government, keep in mind that approval is not necessarily a matter of sacrificing a certain amount of paperwork to the bureaucratic powers that be (it will never be enough). Rather, it’s a matter of showing the agency’s gatekeepers the value of the new (especially as it relates to their own role within the organization) and the risk of the old (especially if what they see as new isn’t really all that new at all).
Sometimes, the riskiest thing an organization can do is to buy into its own mythology. It’s been said before, but it’s worth saying again: never take “no” from someone who can’t say “yes”.
Named one of the top 25 most influential people in government and technology and Fed50’s Disruptor of the Year, described by the US Chief Technology Officer as one of “the baddest of the badass innovators,” and winner of the Open Source People’s Choice Award, Ben Balter is a Product Manager at GitHub, the world’s largest software development network, where he oversees the company’s community and safety efforts. Previously, Ben served as GitHub’s Government Evangelist, leading the efforts to encourage government at all levels to adopt open source philosophies for code, for data, and for policy development.