If you’ve been around software long enough, there’s a good chance you’ve heard the term “bikeshedding.” Bikeshedding is the idea that, given the opportunity, people will comment on the parts of a proposed idea that they understand (not necessarily the parts that require the greatest scrutiny).
For example, if you propose building a bike shed, those interested in seeing the shed built will dig their heels in on what color you should paint it, not its structure, size, or location. The idea traces back to a post on a FreeBSD mailing list nearly two decades ago:
Just because you are capable of building a bikeshed does not mean you should stop others from building one… because you do not like the color they plan to paint it… you need not argue about every little feature just because you know enough to do so.
Parkinson’s law of triviality
Bikeshedding is an application of the broader Parkinson’s law of triviality, the idea that “members of an organization give disproportionate weight to trivial issues”. For example:
[It’s easy to imagine] a fictional committee whose job was to approve the plans for a nuclear power plant spending the majority of its time on discussions about relatively minor but easy-to-grasp issues, such as what materials to use for the staff bike shed, while neglecting the proposed design of the plant itself, which is far more important but also a far more difficult and complex task.
You’ve likely seen it before when presenting to a committee or to higher-ups. You spend months putting together the proposal, and when you ask for questions at the end of your presentation, they fixate on one or two minor, non-substantive details.
Part of this stems from Parkinson’s law, but part of this also stems from corporate social norms. If you spend 45 minutes giving a presentation, ask for feedback, and those to whom you were presenting have nothing to say, others might think they weren’t listening (or worse, had nothing to add).
If you’ve been around software long enough, there’s also a good chance that you’ve heard the term “honeypot”. Honeypots come in many forms, from Tom Clancy novels to real-life KGB operations, but in computer security, a honeypot is essentially an electronic sting operation.
A honeypot is something that looks attractive (and real), but is ultimately fictional and designed to ensnare the attacker. You might, for example, add a field to a form that’s only visible to computers, in hopes of detecting automated submissions, or add an easily hackable but fake server to a network, in hopes of detecting intruders.
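The form-field trick is simple enough to sketch. Here's a minimal, hypothetical version in Python — the field name (`website`) and the check are illustrative assumptions, not a specific library's API. The field would be hidden from humans with CSS, so only a bot that blindly fills in every input submits a value for it:

```python
# Sketch of a honeypot form field check (hypothetical field name).
# The "website" input is rendered in the HTML but hidden from human
# visitors via CSS, so a real person never fills it in.

HONEYPOT_FIELD = "website"  # any plausible-looking but unused field name works

def is_probably_bot(form_data: dict) -> bool:
    """Flag a submission as automated if the hidden honeypot field was filled."""
    return bool(form_data.get(HONEYPOT_FIELD, "").strip())

# A human's browser leaves the hidden field empty; a naive bot fills it in.
human = {"name": "Ada", "email": "ada@example.com", "website": ""}
bot = {"name": "x", "email": "x@spam.example", "website": "http://spam.example"}

print(is_probably_bot(human))  # False
print(is_probably_bot(bot))    # True
```

The appeal of the technique is that it costs the legitimate user nothing: no CAPTCHA, no friction — the trap only fires on software that can't tell the visible form from the hidden one.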
Putting the two concepts together, if you know those to whom you’re presenting need to say something, and know the type of thing they’re most likely to say, why not then provide them with deliberate, low-risk opportunities to contribute?
An old idea with a new name
The idea of a “bikeshed honeypot” (as I’m calling it) isn’t new. The earliest reference I can find is an MSDN blog from 2005 where it’s called “The Admiral’s Pipe”, the idea being that Russian submariners would leave one fairly obvious pipe uncleaned prior to their admiral’s regular inspection. “The admiral would find this pipe, insist that it be cleaned and feel satisfied that he found a problem and not dig any deeper”, as one Microsoft manager put it.
The idea resurfaced again in 2010 on a now-defunct Stack Overflow thread (made popular by Coding Horror in 2012) as “a duck”, the legend of a game developer who perfectly animated the game’s queen character, then unceremoniously added an easily removable pet duck to every one of her scenes for the producer to criticize (and ultimately remove).
Heck, there’s even a Dilbert comic about it (with a unique twist).
Bikeshed honeypots in the corporate world
A bikeshed honeypot is a mistake, obvious flaw, or trivial decision that you purposely place (or leave) in a proposal in hopes that those to whom you’re seeking input will fixate on the honeypot (“bikeshed”), diverting their attention from the aspects of your proposal that you feel most strongly about. It’s the inflatable tanks on D-Day. It’s a magician’s carefully rehearsed misdirection.
For example, you might propose a new pricing scheme, but make the price point $20.00 instead of $19.99 or suggest a new menu item with a facially uncreative name, in both cases, allowing others to come in and “save the day.” The trick is to make it subtle enough that the commenter feels clever and satisfied pointing it out, but not so subtle that it’s overlooked entirely (or so substantial that it can’t be changed if they don’t).
I can’t in good conscience advocate that you actually use this tactic (or say that I’ve used it). You’re essentially trolling your audience — saying something in hopes of provoking a specific response, with potentially disastrous consequences. That said, if you’ve been around corporate culture long enough, there’s also a good chance you’ve seen it before, and now at least have a name for it (or three).
Prior to GitHub, Ben was a member of the inaugural class of Presidential Innovation Fellows, where he served as entrepreneur in residence reimagining the role of technology in brokering the relationship between citizens and government. Ben has also served as a Fellow in the Office of the US Chief Information Officer within the Executive Office of the President, where he was instrumental in drafting the President’s Digital Strategy and Open Data Policy, on the SoftWare Automation and Technology (SWAT) Team, the White House’s first and only agile development team, and as a New Media Fellow in the Federal Communications Commission’s Office of the Managing Director. His paper, Towards a More Agile Government, was published in the Public Contract Law Journal, arguing that Federal IT procurement should be more amenable to modern, agile development methods.